[Binary tar archive — gzip-compressed log data omitted; not recoverable as text.]

Archive members (from tar headers):
var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz
0l>|@C%`0.9s<;X/]\m:,TW9[p٧0ܘa:Lw& oiϦ^|y;\X1ݡKψ$ \g,c[m8ĹhTLєkʵX&pCtE .:R uXb b\'N/X=@rj{Lѐ24^f{R#Id{zڞ9:cP 3x`y1'e'=Djs^c")[i0?ۋR8W}pw7Mo<aކPU D F7?gP4bFQ)h)9_Lv4*zud fBI x"DY({f{19 @_&M)fRW|L0_0\a5J).$ S5!"#R'@N4%GY9G A'hbz DFEhQ08ԝY,1i]Nwe0<_weޢ1b %RY  = &uiূD0qGcU+1RT20$hg$NKg3;AEf5݋JIUˊԸHˊԸo\zT!mX"mϭʂ{m85\9J͠S6(W?Z|v{sr/BC۞-bfs[SMc)-~PpeQjWp~9 t F)#(S A )&;){CpϗۺW\mkloV^{d۩iޓ{eu_Wld /R0)E@NoGPj\h P0<^iG$#'gQ 9`Xe.9K.u(`kZDW ܸzt~%ݵ'=qLXO]*~8UQUEe[VE;Jνh;g M[g06ܗ|j읇 ,g~e +xwgW/ܤLv{K6ܸg`Ɗip5W '9%ָt[Go83Jaz *Ŗ>PMxLB0#BXYL^jʈ`Sw@LꝦ<)cbG9 {~-g=iՀt{ ;Er֞w%>]mVFތq[ |鞃?u^9ū$yX:i=)/Za氬՛VoRod =!"}X{))J@B@K,s$#MM2,*fEt0vn~qp[>[V-~ )okտ[bV/G/jlUS̅Ⱥzů*+`ՉG0֫ L&2M]@ Y&0.t9hy}  [[e9HJ46k_w9I8::0Mik3):0 Ҝf s>sVp3ޕ*fLR]a 6҆`)%f45S$3<2[NҜND4᝴׮Wɹʆz3(/jԾq86tp+I:J2}?m:kr:Da<}dԃT5)wNw9Lx-wа!>zO 7 -12|L3g"U m:saxI)}3ǕQ{K )()XԨPzfG&syDp- ӪwﰩͤtE$L&S+t%@ 65NZHib u4RYC1u+$6!`BLJ.(K)µf8z5|%1rjfYN.DI`RrٴT^s9fuu>[dWL'Az^D^C" |!PeQTZNI%hD04(8p<-I F#5&RBisRRd1"pL׎i EAVwڼg [!&děOL9jcq'KO4wh޶1ucl^#}T+3y^ߣ'礛hG@{ t%8{ @xP7|+t *Ch0AMPBZ~N=+]O{\"TmA2pQy/JE'K,  FH6t ^l㑛&89'Sv_xo98l߭*eN矺;,Yzkf9]4[i(1M: +{l(FpuGApQꝷ督S]׼7mj"wxCi{\m)1Bi5z=Q|7Afjr1}m_gxY$x VBoMpF7 .7S gk\}6WY h{7Y[O^۳ڞQ۳XS: Ž "ծޚ 66NQ2}6t4AKS76Ae(ͮG='_J $[ )Tݥ3x;[`ljrT:&'LR ׿_59KTBԽ2EWy4r|a gO}Yu>\qgϣ-Kk]6W{ܹfݹe~ ӧlvs+0~c‚-ͬכ{;>4鵅ʙ m=?`##E;ս*T Q]/$|9nEyJfFze_ -^A[|4#t' Zl6sxXǾ' lΔE:s *H!xd)y[!?P y tdzi}t3S:FaU.Nk U$#ČMJs&.va[R5[ &';mzʽmM-da [iV:!%^\* $V.:mSA&ɕds2`h^؉ɡGa/i_kjn֑6qtno@Cdڏ*-*d`0F"g\D1A@؏,8N`h'!E_I޸J2'kOl* !,H["B ~KeNTtK8^OS)w$bK=#%x*\aa[a/dbP`}|lc]0k/ n@[KFɠ>xt 8̋^ ׷ӚΞ48RjBtN5H"4lD S>8+ ѫ nYQi*[6SB2$  |9T 9<1J)Z/`1:J!w"C#B9:k~8p:%r'b]xe+̈́՛l0?Kx=͓`}}*pj:ZL#Z٠Ѯ PP15͢ pjkq&-b*AV$A J YR4e/U$9PISi6&c5fl4ZD1M e ^/=  usDCOI$_ |Ki-!\֛QwTXTjI>+FFNH cϊ- C {ng0_a,.#4[Gm:(AIAa):;I%NAYؐ38<VȂKBn]+.g 'y:Κs|C˧ZWG9#"/l`$(1QE ]y.|>`>Pnt*CrD/JҶ0D`ISWfňCΠV|p̤>6~;m R_xN* 'd0HoutQ ww:Xw`cA'R9~d8) lHni:%!*8Bi#S sɊuQ  5XrAP콭WW'~ГSqL;vJ!bsBRAe2G!ൖ}8hsL )Ġ7ѻGwcݕ<;;J* uR'H\i4 w d!lr&ڡ/6]ou,U:x#bf:6dFrh̲A%R&:'Ig/t}H'l)Rː4yIt-psL(0$ 
٢W1xpIM:MYRpIzL;{H]/FdiVΩTGT^ssk|3깵ފz.;uJzT5uC!_i#yyL"Thip72"dאrMF7.^(f: OH=;bzd (9VIl܂Q%; duUur.Ocf=K >zy$ ^vo 皆߮.eH^9)|&QEaV!(z>,5󟏗/Gos. S@P!E b:zP$8@o?RX$^4N'ۄR$(3F^fl*Oh1D2 g(d%TT Zf{ddĢR.,|lyhګrFʓΣ+A.JL|%HWZ]t1j6AFw$i/ʤ(ɲ 啈I*~4l͐!X4WHOqǙ8_Qd%7;awc⣁ I~.$,vZ-{,lnlӖdƴ[#[UK O*5,BZrPfKZ# rQdPQei (cDܧ.V3[*ĝ-A?WȂ'Pk#G3p㏣Okb214K78 ]ؤoBiU Z?„Ftbj+MeFPy7'#7`o'veSd 8u=?MOxTŘAeїI KY.̶{Oem!}^/lmgL)kWyz[ӵ!6u!/00?|?;ᓜf/*e^_ȏ­Z -CqiNcg݄BjN -u: i?N3>*]sQ$D! o"$˫̼i?\ y~_Jɬ i>>^b% W1<|o?ea*-{q?&DJxhc}/nLU* \]M=$c;sb/ 1sUƕKO{2:?&=iϢJ˒(Zo9[g>"ܯغtm+z&_ʶ| y[AJl,`B % l3iu $pI 'k{@7 8:[Sz01#^JC*';1<hnh>3dTyJLfp\4V_⽇r?b܂\ts¸tQ)W# ϡL\:~RSYϞYEX :Ȥ"I7QQ%3Y 4ZTMa;EJ/ʮՌW2T("J%Y@dlIl/$5 ]5//N8?HUKϯ9U~6_f5\3d)0YXYcc6IWcWޏhP&ʻ@olԚҳMķc({+E={t=7ƣoX#|+9k~03d8:X28 JW3Ǻw{LJ7umt wC6Ic' i^f=JaLT49B; m7@ 1 ϓ  ~PH˶v'gU:`gRYcTƣRz:JJeRua@sa>n<VLgPS=AZy-l*':n/&iTCN^GqIG׌bM4~P]ߟސTڞb>G:G2kⰽ'yF3Ʀ/KiRmCkRR*H^I]92"[0Z F$o5@KM[3U" +E)cq_(dYoңvHZRfٮz]^?;$fr$E鲛3c{<7 &*+yde/1z\Ȋ@EKV!'Z*>* \)l&#R"#V>h׎utj#4- ^y.Ǘ`5(.#`?:FU d  tS ЩC@HhiHԛry< u9!,J@v-atkR3+:atd%')[s$Q<" &@&ZN7(c'/g *r2)(pv8KXH]]/ҏhvV0dEhEL= $B-#% *Vؖ r8 Mmqb)ʇ%\刖jNG; e}Jo'[P29&eк+!U꽨1Z1pB*S(H҆]kzu3qAR`KTۑP< WXxXVX @P9J; ΩqsA^s@[dAC8pO@L+ȍK6x8.˽6{ͭg|D 1G`Y0Xh=xK%70XCK~VD9Q$S>.Ut]$2K.=+-֬2ܾ.K %h,D04`6CQgTJDA6F@D&SqY+j3x)mDՒ(Shj8;0 =BmrQxk\ץܶgPְM_nJ)Ԝ|WJ"8/ 쌒<HIQIڋbq$ Ls;F5i4xгS `BBKtDZ̎5k\A "q$lgu *9K U@y岵^hã++.g댉7yYg^<$5Sϫ2APCQ/ZX_KebY#W,C( α\bռA_ߊ/ vz/b-M(,Œ(Kp & ja3C 04+FAf7~;l^|_xOL: 'diRQxc+ww1 䝴}z0M 4Ͼ\ ~3x"q3[^.4JTB"dTdu (dT(mpE iжvۗR|D+ͥ]gMT@^*d_";mUP,%A=ک-;,$ڡE[X΂_ucMa>t}t֏5?&!aȍSn\ /ve.ل"Y9}ʇx@nv}5. 
S d1[P*EG071JKYiA{Gj & =|;ϣteLAyTwjZ w2q}cOD'E|ݣ?O>M.M`:_y_qNp N}{1[Z]6ڣRRyS{Tgy}'Z$hIֱMN C쀣Z)Hu5(k JG7i i4J'] 霵dq)vOU"+e+ pk eH0*`ucf5,UT܀'TNGEG0BHm*j2KlK 䢥Cde rI…P ?̶pv$e!uWחPⓉfwӫ닯A:~ݢU+t%QmZ^1}m[s#&+U`5VeEֶ֢6 D62)/@o2m ` ;c<0 YCpc .+.)Wr*JUhul85x%zz\yʺlώVJM@}9Gzb,Yֺ98h2=HM>"u3ZZP dv]p3BҮ =x8x=jIH]$(˦sɺd1b0T(AV(a%A]S~:Y7?VrUګ׆Ͽ?_~X0~Ĵh|bYEH|l"Vۀs\ م.V6jTJu1={f!Ál 9g,sdL"LI0+)Tݖr4<jFT7˶S+]lrWZbm?@x"ZsS『3hl g?ػ6%#QՇHa$X`_OkPvzxyJ5lKaOMwꮣչ.J8݅JjvŚLpa%Ƥ :jw2κѤ 'M!JmɁ< c^ՅA5]_{ӈW/M2颻M }3LJ7XG"$Dтky R}soŒ}}#IW}Cn?.K/^çvUYetrsr5dBSfǒ FXnR ϵ))hCndY@;IU\ 7;k]VֈݱzJ6~o;K>ye;\!`vzYwF._Y:xaxd+^~jְ֫ SL6eRۆ}@3lڶLOoZ]ؽiiSrK[[%@ Ywg#Oq1{{4015q͑I# MI4nt۰2{3ؘ_xd°_*j]{je0'&KF=aS7C8p436CAGu}Ћi.^]ug=Dux\Q\Tw&=lECWsWC 0[y NfE(GCgrx4ZOݣ)T<7C!BVg "j%?yuUةAM?4>,L",w78@bRbB-@RXtz ߽s PTX!\h5SJ KB)ೌN 5Kٳs!S@O_2/<<'a"k9/'U.N o`z&tnݯЧEx7vSz/Y56"w)G8[1Lq FdT&O()R,i36IIUHFdIkBrh67k'2(_x/bي2ˌ`6'Cv):%Bc sJ">K %r.\s̈́Y4=)Hw(,0@.1M!C ĀEĈru>rL`Wn\dŅd$ Ɠ BG#[TAZZSdșcҒtM!gX1$IH '3\l" ։/ͣ%$홈;IAShtwjZ'DmJ e+M aCcVhEqh%ZxjMٽORp'`&l v*%3օ"s4Tjn̑)/:}P}=&;dg=)f,oT1ФrmnU- Ą\8ސWEm\9 + C6eoWXwaņ^igQW0E3zɉJ9~.y:Ǝ_K4E/ev$c<8A^"@¹%:q&evF}"q9 RWn7#Ƴ‹$-̵!H/HV& ĝ/ ; ]@!C{Sl6fPV3.)z}N\aP{** 8x49z?M. _b fbv'NdSGhw?,Ye:{/͙g.-OS MEԷ._4{O%JKrrzWw8yu?47a)Ol\9HKaVMdaEJRb]F n+Lp%2\Nt^oKHs`=wȻĬ&&WI{Cn|?Gilur>瑋$DH†o]c(;un"^]mb.){OK_>υ~-n!'9"HDdS_-^zzj + FuCУӠDjT5}I銦ye֢p>_]~ i3So"غ&P9ߒ-)? 
vo!@ʂHLh,#.'&vp] B\u=넼B8-(V(eD o̩(R@!²bNAuͭl4Taxb,´[e:޳ !R'7^8E{#_EͅP^BY8tP6BBXr_- UZU3d1v=HWDOz {A" ~6n23_'y @F?K1hh:dI($!}WǮ:vFuzOҔr~Y0͟^\->݈gЬՊk"Ìĥ>+´:bbyXܔ`7X=#DwۊN)ɮk6Ԛҍ\7 >gVƭـo;u"U.xiۿ٧.:GGglԽ/M~ز);\iWߞC ^v_\6K ȱ3)H ѡV0ep'mHޞə0Qрe\h۪}ʥ^:t_^_oB0AlOrK}Iޘ+& لrL.z.Ew2-dm~~ W5n5ntao7;_o^yCz֟/ t\Cf^=rq9K/OXe ›N"wPՠX{F5Z`S+\jj=-Jû-oF Kߒg:%z6<$,#vbSm>bn[`!4,Ad>,[(t.h-MҍnyN^o+BRRanS18-'Ef1Z̑읕΢W F?ƃ/ KVp03`MF1^+S,X-cT2;@kVX"6$FO19xHIn p1z -FvM[{W쪻+X{ǿX0E˷;|vȭnל}8'ZUͨ&޳rITO &1[/K NsII[#gf܌R^EBlYB wߗ5qGԺA~'7Tn0}7P Daq\Lz/(M*xQ Kp<1fl`XC,hӆOrX%LԆeToC8d$L1D&(B 3cyPI;׆}}X'cߐYSZ'mV˨A"H>` Ʉ@@\Ns%j [s IZ'iSZŽ  cM4*jJ'5|&o.zYljQZt5T>sRr.2*&2u]J7P{ޕ_S*5 ՘L@F2!,z5KA@-W ,耳«`]TN[C,ǔaMghХ(x#ȴh'ZDۼŒ 9LXYL6 PZ8钑y&`Jd͙oo 5$t$a̫l9XeLɫ6X*2h#pQh9 \3j7مR om"ĢȌ0:;He\;o/IruE͛ :0KګφE2GA( \!!^!)$آA$o dd.rеOyY;~hW\(Wouہ\1q-3yٿrE`7-lF"VYc΋e\pFww@d/TrA/Rr훵_2H/T)N0A3);4D*` ɕ< lu+@ ]JɈ*k3p!+ 5Ƅ,WhmkpjY Re=1WUMy;~̺zrio..6y%CD5.i ۼ=yA^bo~xrk$FN`;W/Ey,WG.9nAf+PU '}"{ AuN N Z:> jfBsOYs>lg4"r4$ H iyhiLγ=tؾlxb3fAS3e}2HZ]yo#Gv* nWA]È`)q"6y=I∇Hʹ`kbw^;clwUusd>el55n?ћ5[&]~6f;!q<~i7-3? 
> 0077?yw(l-Ws7XCw}׏Ǽ>rįkznā$ğ9Jsb ]3C\5 4]?eZ ءh&⮕8J$bHVY.Fe19G`IJecIS^jhAZ!A&'ux}$B cE0*'S$z[:9;BzL=4G2HH,$# 鉲Q|Vܝ64?NyPQKFi0`{8΅l/$FT3^8D,μ͡?^%X$9T .e%/Y+ƙ4"#G!*0tT.$4t h /,gQZČF#]b@P";s0h~~Bdt/P}ӏU};+ŔPU8v}C iXݣ; f2C+mHQsaef'A› UŞn3 # pm(vy3g7+/G11C$l@-էʊ9պ vQ)ZD)k" +VwVb!pV=UitvZ\ %x`RmLRv`rC/8Q0U"grCMy75kԦWmau:PƱGGBG ZFKϖg4t] %iб ) W)@CVVE6@Ua=_^f//ͬj*T#\_s}~S /Qq-*)z^)o]^0gi5еǝrqƼqzYO}/t?}벑._,^nz9J >P1@C 5A)g0Gb% zVBndU@;g뻳| W;k_鞞-Ezr6n7r\'$^0'+u/k  P/_côw^%N6ۈzI- n;ػYږqYg.i1i-^ӽroitԷ͞iʷaa«=03˯4tOc`, 0jH:}ZTNI楕C!y TR@aHG% E2>9~4 2O}r\k҉"ڗ.Qڥ_N N]٫;6Нj;i^Ƿ9yrھ_On#hgjwB9kyVT+on#X&J40Yhi HH (;%+5-"$BJΡY4:$NBd 6X[J؄Jf,Fv͸D㌗BQXB 0jUk^Wu'۠yxi8=݀OA3 4v'@ kf#9Yh\*/I3mc"^>f˻vz=5aT6Q 2y SKQ !.$G**kKHۗ2ɘєuIRQ5D%U0рkMGu(^:#رwTG*տŎJ`nM_6qrϺ JrVC2"<KPv{bw؋o;pPLDG(HĺZ`CJrJ%cMn}JQ;IVigTU(sP)cxy@4À;m<8v(Cd(ĸ.(CS@JbǵЦGMeW^M,S25sY>YDyvz7{w3QIkˣ-Ŕ;(",Q(r!`q됓 hͽPɴYPj[eM1^|@퇷M?Z1)eL*Ain*/$mrJ(U(Ѣ DEr{"Q\yl*g\QP"C߮FR` ͊RrM$jVP)mlXGE,D ܏|oev$XQ\Pjx h! YK8H"T{A*+H/Jw#CB@v$s,rCDJS^#FpdNG`n'8cBrt},,pr L0f4uh3]ɛ񻪉T7Q]JaY:Non^T138}N@cG}t.B1Aګ~WQ>vCfz}^п!@[y=g<^ (ō]ǐ݌=|(U;o;kg\yR"a]T>!\i5'uwOieXyZ,6·coE̹g(QgwllP)bh~* 찚4$ |.VY$7yF8N3'^"S4 O)84_-I#yDgcN'2ip 9X鷬%٧T^,Ɵ6u) 2?P#a2Ǟobɜ}Ќg5ҍ=qEVȐKk*}ɳ^W;ޕqdB`ڪe`XdlFIfNIkTDJ<ᆵKfYcf_W~};ꝽGE@V!Fu㊐y/?'>gّ'3īFWE]vh~nP㠝{SbmA?yK@/'g-h[f77z;n͂ Geh$o1 A6E*8 xЈ'۱=)jrF^L!y6Qɤ%ʀԀID;ybe` Fu#Rh.|VuΊĎ6LͶi)y&h CZRsh ǡ$:M$L%XRV~Z==|-IUH]qf+I ShD묈8w F{EdYg%NPg~$F?qW9!9LZA庲E¢N|I<]SyUz)y :wO_ A4kDDȥ.IĴ*Hq(.n:+ڣ om9ɮ}ǫyz%|qʾVf%ݚ5oNhʀ7(a?N=CIpɃ0u$ 8:"Eȵ[H!U9rr™*1S֘SQw> Y7W_le].%>%ԑQ:v:z.|BI <XC#w&@@oO(*7F+›sP۶jvQ)WQg77OAwasy7ql2?MGx]|u~Vf۫σ=LV{[*./;{xSHsޏ [׫lg,>a YElމC'\PQLŋr/%7SK͹gf*9x,^$5f{瑨~xw7Lzdt%D&07{x[aJ]Ϳ4ћD<Q{=KFқ+3g+VNKQEżLE8ɔjR+N?5b,GjΤ>޺ޥ/yӋ@F]abE _t71zmF{ ^_0 uU3Uj DUVRkTet ȻZ[kQhWK5YwPkHD\k5(N AE*>a<( *$`OWIolVC}$&z\NG4H6F #gM_4Y﹝ToD8 SE4PNxb %ͩ)H%ur| A:I9I q2"\M'$+L&Bc-;LD4+ĕ?Gb};Ou>g84ҐK~$lM`TYgDcR\k'50̤|&z 2 J! 
Xt Fb卓հ9>ISVYN͠DD'@ Ϋq>U_|V[A +| ^%rֱ$.8Aϻ#RN#-()AOV>%sdwGcH蘬pUA!$88V2"JIB8I@e$gSG'LIEx*UPp:1S#}jy?ܾat!⚘:y"@QbN_5ns xS y!rQre/;xyΙ!W'B%r4 9%.)-4S 47(THL451sDB@mrN*daxY-#^f-zly|[ń -ٺWIXϙApcf7LXޏ9BK\ۚȵ&sNY OgG Sx1 8dq9 lh8z֯f=!pdpӮ[ޯfM@(1O|b-NEwQpAYJA"X d  AiXfTԍF]yiԑ@g1)PV*N@m$Rlv=qe/}Mf!;nĺ3rD()xO%"Ϥdmh-TԒQ X!κqv[Lr+D5~Ip'gW@"g?B,QQrL<%(fDp)[G .,yQT2 \1Τ!9R yVRp%cTZJ^XΊ_kU_;5ϥC.ՠLGQ?*X'dHVkԱ$xڣT0NJӷBO)AAHvIV#HTnB%B$IV1O#K٧Ih=DHJF\1x OX 4 D;QDqUGb7xT 0%wJ, 8 Q*5"Vt"F /AG&\BQ'6DB-ME 1Ax_#U!2}-qFX?ִ ft=zSAA*a'T(q\=Jr'I28kdelH#<P6pJF}raUPV6lׁW?\Z4:bcx7|xُ`iN!ħ_P䃽pE-#v~/*%\ԔI‘|4Se-gX"g;p~~sNNgDPgVIh3t_E-˟Vͭ/ZZ74n\{&Kmz7W~?^7}}3~|&=ޠ㇟{4I^.2ñs)zȍtg? P/?L{+M7t>O _?v.2q -/󡲏ɘeǬK?<C)q,T,oJfVi! QrPP " =Pv7H1hښ"uI{ST)0B<)C]g}rٵb޺y}I v;':9JJt{[/X|+!%˹"X+DrQZD)}ԑzyw^Ϊ }VCc$RkIsxz2)x"F:19[4}z̫vkQOG+|uf^+<ٹ>@.F؁SE)_y4X(~~GJwP;+`0>p)lC8CG yyճբ#6wk(7/knO-3xo;{ߧvw'+$N0O's='u>ޫ۸\q>"LҳA^9o !O_t 5f^s!!7gfU\LHO__{ҫSik]K=z7r;YK^ȭ)sJǜ1~\Lm&@2T: gZ{8_!ef~TuW a$X`7FO[kRD:}QG"[驪>U]}.: IG~ JVʐR"c^ W ''fJ`1hF `A:w*2ٓd_]/>M[_|u6q"qҒ yiDO}R+kQ*V⮪]f}fsW])mRX`Jޣ*^rW,0 :wUŕPU[lﮪ](M `UPU֪}wWUJR]Fweѹ`g=2J_O֒5~&錇#Z+hm 7IU:c0B%MNF# E E!Fmd;X,v+>WZjCǧ1+W0ƴOw<]asOsZՠ|7緷mtp \z>te~A5`Jɡp=zP&8R!*Y~CwУ6Hy)o#m6RF{X*QV6RFHy)o#m䢍\6Rf7m|#m6Rf0)o#mFHy)o#m5RFHy)o#m6Hy)o#m6RFHy[^×/uG8g7ضԞ).YUʼnطu!Q,R,DWn џ$vJͻ"zfi:aƓY+<_O8T,ꠜHZLA䊷h,BOaBH9˺=[E<1:˓[HɒXRz&kbWtvP.ʖ;($O{ YeXKyzO 7;sɷ/&]>}ݿ .O)[R Z7#?Ig}<ƜYoVoOsWym7fBz~g=ގ@o7A!j§q'uUoG&5Xs/lt%_dݣE-~鹏'<̳/Ul|.}u<^}kz1;Tݒ"Z^dNx499o~gp6:ȏn -Myi=#I`&݄ ::G^h㑟N0/+| yZ^,dBrFv9iu)2@@ 8&j#%zGQ!ylM!#u`80#T(I*Gؐ ǝMK4hޫچt≵ mX6E{?][E#%PZ7]k n Z-nj @8'C9/j rAYtA+)Waj4~xE2U?*#_4-:ur^ %@V $ DlhO8ۯ'sXݽ,[9mk>2voǝx[:kf9,ZaLkfaE` E(qG<f횧gw'6T?zjgp?/5{|]7։ lʫ:q(XZ);ʫR+5CeԳcngL]:P"ޥy,첄B.`p]QBl&/Rң7*SKl!pljNQcL1$L ;up Rf.CҠv.V-Hmo[Y˹\z:r1N֌FhƟ3ZF߮IwolzgdSm}qjQ5n;>0{*YC6ӛo^*, f=];>vJ-U垀t1lf(25@oMuѼ]O,nQw2n~{Y{~?% rc#ZGN o^LmފSx tF/o0ռ>{4̓?@O1`J$)vAQ]/l(Z +ؚtAiU&_*U1{\Q28Sb."Zc&;MZp7[Ҏ}ؼvCC%pʠMQgn 
eʊ1]‡T0n*|)fcƮۘ#)PFRtJ5B$&i5H&V{+!!llMfiҞž(iP6'I]ԃs~g;,ZԬ"i:V?:>=Ue%|?@R!:q;"tՉE*(GBzRw6<)e*WJT\m~"[}0"~)P/'> 6#NZ8..g}w^xGTJx'..[V3Uf?>pW=ڙ8ìk]+^$x 2ː$F%egdoBd.$]/Lʪ-R,BKǶrɽѺpqCX2x6( $re!,:d _\2Nb!iSV>Ph[RЈpn}>?'ҮM96H#H~L 3 H_ӕoO}EW^;a{ ۯ<=c{)qw\xQ]<"~`G4_?1llj,6q: 6'=Dsuώ̨)m:ލS 0(%+E *hAjaq8x[M}KSqf&rηG!'T)T&s"֒/QY'm aHYdt)c>Sfrw(h뽱"iX#)]ʪ?וDm8Y6$ &/l"9TI@/kR7vF (.¡Nv(a9lKQ TP︛& mCfA!+)HJB !VAzژb) d%W)dָmK`v Ы[^˝+gtU(KwP=].~7۴q.'Ӻ}+i4M/OjlRMɟ_(|] Ż/&]$}Ln7J iD4Omkf9)W/s>|M7fP  sz[d}7{i /i^ԑV}r>FW_{n\M=I5L)<>gU:ku s/|[!u %\%Gx=Ņx_Or~< j~nPHML8Nj @]{JxFiϦ3>ǖw2a]Ȋ>9Po˛+W / SFpyc-۶ h)$7oi$H-I<0@*栝b\"0Dp{ h lN~NOlMqFd8$sxSZpz$scu'Kf(47)4T/^3Ez)7{7];6$C/(o>Tr͏hTY΁q28肒O4PN>E-ym96gxU,1(Q'Lj]T*y9c !P(l['awL m.D✊xᓵ j%&IAh1+Yִ͆&qL:x*~6{P'՗I?7=ִ? Ǧf{dg4atlƴ6 {l(`pٍ3|m vJkx-/@mٔ~ʎ T'boC~B_@9خS@y|5u.P^G P{Vờ^ӯ?Os: wGA]FWbMKW]*G(tʟV[%^.х9\հm//t#w罥_a&P٠tL1LR *)O)\QW1}䲪yc,e7-G gO*!j<rX+|l4۵?>+/^mqrue}SVې.Nʻ~3Z[fmY[Yww7ṿ][ڂTv$Ƕ1 e8nY΍ZиOoh$=v#YЖ v AO9O)8ֵN,CɁ޸ 'bpIZ_=esBhYWcօ(\X+ P@ !'pBT̉NRs`:U lQxRw tת6B-5s,6v0+,o%ٲǺآA3f|EL eBH2 a@*A["g;CTmM^/Bo]@=`Gۻ[!x:S!B5ϊ6zS",!p(8m3R(S yzLr(ޟrv aq}`Ԧ격$fPt ZG!8Yn*g# rArY@tZuE;ஸ3&d2Y3r喇¨t/5b-@IB?$)ؠxQ9ұqL-ZMN4pHErykB"WSw9SJ C ї%pQO2SoV׀emgt=]]v@ZBZF{U!N =]1[4/o+Kmq@}SOQɯ~+)4#a#?:_L|VȕܰQ9EG(o!3I_vrو_~YO?- ]` ]vLG _q{JBMbmvb姕{֗E~\=kgGWLbՇZ=[Ԛg7l|LR{g~]qψϾ;}Gliel_|ifާb{t=yR^v"dA@Z!X4Κ @''^'1*:YL2Tk?&h! '\&9ql^ *E% 䢄 )EҐzvAf_#]&r}yrpI3r; >225_(` "28nD8nlYzwS3/113\Yڼ]8ݾ^(RID_#r,{{DF"@Org*Z.)M1W(,܀X书&:' hw?V5ۜrbk/_?_~[|>y"ƚϗo/k x*'9203(J1v /x9˞KB&dE 4rm I;H:s ESxi3-].89B}(p% _'%:^1lFΞ d>֝{pds2g,}Wn9Tӽ>Gw2|Vy볂ख़ sU:/[IլݹZv+uLdПX*\Z3jݍfۻvtzfw6uH-dKnݿmCwwڣ絖/louwI$}/7$XEѨNcWM=tԆ=lyC yno~Z L.mFż}ewpBvl~ȩ3Ju^8xdSiM8g"+םg=bBH\G +)-kPm׉٣=j[3ڂVX+vlr! w`wp}itt"t$I߹:kǍ7Ll e"@C6Ĥ л} uI*Nj<&PqWU!> 쇋!3_IHlV%= a{ch'V^~v^K.% w]^Z`>Qذl0#Gzj};¥ke~?u )*0klAnz)7BQ&6DIUR()I$) !d |-A@ =sy~x6E:Ѣ77ST/2LnirgG,$ s #z ru~b╅~ahxaZf]D=ˤͣUD6-t\e-"b-'-h |t/~xA{[}b74#TebfąG&ai>ַw?l8r#]1֜Ue#_oNq3j?OJ-dynDZ)U杶PiXW"Z}}-gneC|'۰Plq86Z8i6UCYF5b9RdoUkt0:Y\&Zy"mLGS<'&gXț뾔n/,1֢! 
¦^=|)CPkR,Do K%W[)Rս/R0dY͡w~^'j:_5;/NDeWt6WO,]59*2u_OTyv3&/".wf1=)'bH+5y:uE<*QKg(Z;(ڗ~zaԕ%QWD0VQ{B}ϢR)TW7Ѹ~wŻ~6-ODzM,y3I?D-|滚sbfZ0ͧ q)ZnBqAG\26U ϣkmy}5˅%S.jjOjC`Ef$VI3J!k99G>cZc±`bL)x4 }*)WbS8Y\bÜ%Fv>Xֺe i/XPD,耤t dq>ޣQ|u6ے3#MO!Wi%[13 ~'awIXDɋFO6' zQ؅%v=V!<U"vEu u"4:Ep`K4mKwɃ[ee0j "&K9N3pFD5')ZlE8WϮƣm{ J"ydi+S]M=K+ާ̭>Fc!)|J&(% 9cn9%(bnd Qi'%YHɓK_ Tp!x]PJgh1h(JB4#g)?BAR#)f!EL2G"ӶX5zOޮ|6kJJnBt+uօ(\QAG4 8W9q'-9>)@r1 c}1A "V8)px`K[.s@0V2#X]ڇ!0MhƻnY _g>c9Р2 8V"qV*N=D0CG psAy=^2cTņ0;jVuN1|oaZ'5b= @IB?$)ؠE-ˋΑ)Br. "Rn /z/ A*?itrD%aDъ[i!@I|2͊iZ͠V?i^1X&FP0޵6nd" ud6&;iu-S(S$%KjQ%HmT,;E2e!{(Պ[K.2|^pdgU?MXͽV\eG袢Q.a[ČXbFK8#l (0,2Ts[>nԀ^tu.̗De>摟ϨQ sbJ-my!#(eɐHR)Z$8z 0V\#vwbuftٶuz"RC?dHy@Blr` WG ;fS)GN Su׹شOBt1h0 Zp5gCKTn4h͙?ly[N Âs@I PCz Y@S`5X@UٍF Tn=@MXw#_ tISÍ7`\| }!`mZw5x׫oorPv`oJgkJ5\]<pavhc5[j^Ԏ%5J{Hu"H%`jq>{m਍ۅUGY=ډ]@s41еl{u"% `b,L 6X.!`Rr5` 5c 6^;05hހ*ErFL<Ұ !~".$:j>?QtEdq9E|0lЋM;2߭󅹛4cZڵK|\v 1D ,*$ )#63d3LbEfuވe1VYkX@QAsglrD`p'˶g}|<g+xjq Dgul];NL%.}Y5u-S68a=n03Y,  ;/io=co=Hӈ{H-X܈(IKD`SeŞ9HbEB#c\UwQʪH׆77y?_~Y?գ\=< dXS^܍eOq?瀗k,RZ(zP@Re/{xyGL;ku # ;D҉*Q*tF32OQKnAHRVa S띱Vc&pNc!Pfxm9{N4[lX=tlÔciMsj:ompQa汚"-vgzi=g1uz%RX#SR2RMuQhᏵqGvBE}Ε2zt/g^7ϨϫQh zcWZF);M_mL_jHJJ&M?MgQch+yP s maJ;Cl2|(s\ڍ\i{/m,oadY7=ߜ%`"{&CDT)RR(CX,-,xGI4 GrMR=3|, pu|<[Z0batY_{-Uwq *{7ƩC$8+ӹG\`P{B {X>AIT_`gߛ? Oh(^ %p=\|67 fn2Xiepɝso^FAN"J3 Kg`)"tq b.LU[;zfn/HJ`H! H ˥CR%Br#lђIJrV/㗱DD'V=~?mblT%J=%6 h],`ٜR N,놟SktjL}D#x֯zk&@@@ (nz$z%_~L6@ nUE%]])L7 q@| Pޱ߿yi/Eˠg+h[%g7|"r7jˆah$9W`&D9): 8`M׶YA p!Y)8Ḧ́b`0_ ^“8ZA18;8*+METPvg+Ͼ:C:>m0Чӄ>.zKP;6h p`o?6`9q?ӜBWX`B1G)I? "m"2#'+ΩJaȽ`s=z0$}jJ2O&dԡ&Τ*FyHп>㇫66: =^w~43xK_5hWr˟6CJрmPYXJL!0"~ m_#Fcg6Au dx' $e(S82}D,yEu\?cb<}!ךImL+{|) Mkj5XFZ LiIdcmŏU Ԍ[pGg,ZĻpi43#gye9ka% ]˱ilefh pD5eS5Vce,:uq? 
)np0EB,, vAK 4gFiv^/ BNGK!zƌƁZkO(R_wj$E.+yL&"ibzhp %:!`)x$>[7S9~(;i5=WcX?١?.߭^f~%sEL׋S$tҐn2CPDP*DCg sCwڅkʟpX?]zQkj^ύ'|^zONe&#QHYu2LwZAdcZF hFs+%"[kw }Y)`xD ) T^Fn NGq[8CR0E.(qCvYI( 8ŰTk9Y4\O&IN^5>RL@Gm7 ,3AH喌ٖ[2nG)'(PY3B£LXT%u}sns+7Uݛ (h4~ 7.S  07V 9io6(nB*lC'ZK(IbnyI4`&0ґ0хT%vv\%lTrQǹHmKU iU30^aA9A.LR62?][o9+B^-@czf0Fcfe!o#${X*ɲJNݎSENQ;\0+ddGfYoI1crƴr]&"рHCtHH&Fz! 96"mD1v;(v4}[TriE^E>36fi1XDJwt״Wqʺ]"LwgtFqx4[)eh;8)?>cf XzSOiH^ɒx^ u%=2$2C,JT1*NTz۽v\t:\}-YZZ0x,F%)vEeɯk۾뢷ѽxAO" =cg_Ob([?'ޓ@Vgƌ#oÔ}@o^l\G>\7W7`fb9V ׿ā`7fŘ8 T7 FJY-N\D38$`=`kLZ$j F%2, $D$MiA$oez|_=CgvvѨ5+hc.Y,o_}٦]hz2ߧmŘ^/'rAx6"Xrܬ`IK0(N!*izG9OMNEoˁl9yn*j\~z8R{gg #}F_1z'-S\QB;۪\czP+[o*[o--oɴD|WcK(x /mH%7ͮqk]\f (m"-㽌rIA:k>N L69Ng&(ŅF|B &2p!k yZhd]&9]e F)k-W𖉭GG|X3 +ȥ>5ˀNZ\}YƂӍ!Е>:=y95/B,Y2LY*5c)Is@Y*+L<+]ݼ:19U-?]uSM9ܩSwZx<M6ͭ/ˬg"xNECݾ,Fozc՘*l_vBe:n[of~{n?|Fnr?0wއ*_;&N6"[oib/ۢ]lyM<|??I^빍|sa4}sB3c8|18Fkl|#x%Oȶgn>CY<-(xS=5ўs̷5d x1BL D4FLDrFnQ{Am|"08ƜF$p Fr"٣T.ϝ49`f-UN4ЫؐNϮ^ye~ 83|zsQk`ihPIc DD*A6(^XZ # Gxx+ =N"w6XF%c"+Ɍ0E? dep4Zq'۰L:1,Y84ii0F8] Ϊ`lIr3Xɥ GQEpPJWK [d{7ߋL>j1!sVr&f$0Dp)'靱Q1A(n%-0/#mD=t"UO' 3H N)Q6 K@s EYcK2ꢣ?L4U|?rD 8Y$&' 6Js>U7Dhf\!0šQ6oDqG*ܣJGD&2nע/-ɟ*NY&A6 GA5kDCM4i+_`p8AD@S//@ܨ웋C]8I2ɼX!5ꎕB<. 
,磕>s}uk%LBOo7`gWwq۷ o?^;m7 {g_gK?>_§a"`dو7 7`Rfȱt+wh$9P \A(E}5pSMg.3˷0';,~#pLl-hr l[^}e/ವGd>#0O zDh.N  ǍSjj]D}Jm6_.cOd%m[CLw^lܐ(k9xeU{[eHliEG~82 X?QʵK <2I]erUrh n2_IWٳǩg;uM5e]aסWrK./+}(gfEぷIβ`LZⲯFtRw5*ykoBfU21%ik% j/~сqĤ&KH[crKX2k[e3SfG\z2brYGpxe1KD2D[73,@z(v>-kn; otO~LKqE>̾`%No7'X^\-^q}ڵԣP̬zv-y6.Z4DYy$H°Yz~a[3BES"oDBO"e {,@ {|OnL>^>`E-Nn|!|& 㞄6ѤrG~7NVnH.&a cќ':٫B_L^d6i咗M73a˝֗˜R)G0aZFI%0唍R<%ОI6Gƽ*{$g-pז+t^_UqN䌎'9qiT!3١E*!aWҽMs>ha{?dSv܎9va;"Doiٴ[K l 9r%~ԗU )ϛ!7k )-UBL6jm2VݬJPE' 2Iٜ&-8ƵY#37_X@%~ [a{Ms1{,'R[un ?z'2Q~J/5- 7+vw*Z9í)THA!RE0l )fDU0ZD5H̠ 8\(Ie+wo<&#Aa0X;09{t,BӬU?2eܓ|f_&=f׷~]InLogjK9d^*w^hTuBbVV;fZ1ȐT9L9EC赏ҰdIC`e۞ІYrA'E2]ڜk9W3 ڒ9%v+$g* UeY,ftpÎ:1JYЧ?ԯwf㩞ƒB0|HRp/'p%炁Q2ǰlsA ^ڨUiQ*+1ܧąǃ6H$33Py9{FI2C;\ؐJb_\#;J;T,{tVS5ʬ 9.(iTu`6ǵ C j+K\tE#UL_uDleG(NeIBTtpnR9 ;){}/["ƽLiQf{oN-嵖iI^q}D,&<5Iٻ6$U+nſ< ?ٝW8I]mQq*$;yn&?L~ZVrn/&}=.~l7]~?P 8 Cju70#bO}[I(unv>Th -n9;'AA-v8^y(E͛=rWqh:Lj=rg:S`L2.IYeSMqgŇg:;|5DoU!Tu"츘ipH$EN1kWRu HUG8t# ;rd*Q5M(M~6_K-H8j=!٘D鸞\& _q"GcJYT/ (D:N穭7E-spg*sݍZnO6>L40G2ȪZqK'<۫ivSwnsM)2lqBQqoZz[fy 7Sy keoS{ʸo-NQ>s\ئPf2 nN>n)ʁmẒ@4iLhCmMpJ;.x_4"*=zؔ|9 +hQɤ%ʀԀID<2qF#uQ͝8xWg<@l{%⇺۰6,9E[IJALm*Xt.Xc8K y,A6C}6^$ zx`lRQ1/9S קy44Ј (!; /*b!_臔K2Aqʯ:Q5 $B9 ѐo؝ҎQ2ΗH]hfx:ED."8O`]c9vzbqi<+xpb-G/ϗ:A~z4ALW jb}כpa*EcK)|BTit h"+yQk(:q/F\h@S AJ;)=I&d B_76H WƁ> 1jM8hą101쌜{i㑿>X;ڣN D8 SE4PNxb GF'MND(qww95%p$&d.rv7`hoN(L|Ic-;LDǽkj#h8Ou>:4ҐE\ D3ɰl*)|\j,~!@" 3)WqI' "l Eg`T*e9eudUuXm-N;zwsqV^%?z?|t~T~_]ȕ ߏZȓ;fc\Tu.zTᵜ-rweo'ֱo*I3^tֺ?['mμmh*I^L즠nõnh͸[ՑgE:Z4)I;I(!>Ngs"_YF9hGrcq\e"z^zMѪ}Α`4哾Oel8"(a-ڒiKĕ:V:LxGiOAwÕOj}G]o;F]8q #x o.SUy4;-&Jð=M^"i_dX֥DZ̾z3}P\NC1qUg.79\aN<4ӵ9y Iwjc$S ĩv!ă2LIfEy:鉶6tZfe@R&=lG[ xe=QTAw>[ ˄vsLF`K7#Ko8 K ̒.妗 qC 4031CMP3 Q娀oy`z9Ԋ=m$z7oD3l'm9*jDWuy{-o.vnf諝F'OӨwPT՟8;x`{ qowYY`M9?uZ;Uٍ3TULmYn%z:hrfv'sk@07N"4P:3|z+sFо!@[tsWwhOX70 Z 5 qzQ{QB;X'7{\Y)7SF8ub$;$ΊuzeD]|Zo,7꧚Z4ϊu6vCP`j;Z"?`X3?-~q rl"Bќ kW&8^e׫rpD5{F%v4%G'/8VU؆xⁱ Lwƶ8)ڍyX :1X%yv֢ ־[~ &PPԉu\FwσOY/ۻ[U 5J"rs" R$bZD$8XT`7XA o'ۢC]pW;!y]xPkmo?{Ʊd ᯛQ]mr/pneAOK0C꒔ey~Q, 
53UOMW:Q~wM[֌^Orҿ;bY|5y?qzk!#S|?7LƳ\pZ|F57dʐ-Y! 2էnZ_y!tW64ߌ6.qϷ5߲ޚeH;f.qN1 w}|[(3 W.*;ⶭs=F➈" TImiv+m^,׽]4:zVNھ4ICt2puە:j~r:-P<ײ\|Q'g m!Q7mxUĖ(3m(I6髂!a3Qf69̂ʈ]݈͜ǣ\vfD- j{y3@d1g=ֹH ,*%j*$ȳaVڸQq !ʢL&&HBc*#sUddٍmR, ,D"Gy\R0Me 1)S@$\>Θ`"rbI6ːрg|ɈDR#!KJ5^eDf Kpu]g5/9+U; YYO]>CP5iQw/ OvZ>VQ՗{ϣ'r,NnJ$uV!7i,QF&I%uhCߠ!)Ip`w̴֨,2,*AIE `reY8 H#E팕z#4I*Y 9 E! D)}u(&)I۾ b9<#lkˠui^jN Me .fuq p.*F9 Dـc΋eI)8̻_{b/$ӹE/"ӹt.Dd91L)DȐS6+N{\;)C2^?e.ec/xf>YgO7Lxr{v^[Ds~W凥S9#oavw~|Smq`~=%'/X>">3s}O_0ĮœmU һz2.O 70ëgj-+/v( `Q`w%xԪIF-!왔tǕTjPtjPt2WH u9'ၑ&D_L= F-uq)F=Ē섍**'#W)VTvUU*Z\ܡHޡ!]σckˏ);ʀyK39UO8nȴ:Rg|ajF+Q0ۀDS BT>yn V4 MTM5e6;2d83/ )G g #Ciʡ<yfjgeW,Lx18bմi"嬟p̠?!yn98)o Ȇ\.>n7g;x/8ڑa2o9r\T;N]s7?QnTssX>VZy:80RsMyqSdG퓼k9k&ܧU,5s̵R2si 8r2F e ѱ.Y4vJ^cdY!H`97^d UB>ahƗLokk1Xѡ";v|DeP|.\?3f-hyT&G B1iw>c}t}V܆X2LEpeI0ڴHȲ/%MQAV2:%o5T8+6* -ȉ#7 LX&5YÌ)'䭬g3ڸڥ0Yvwe%qYll鞝"*A! b(0} ߋ/z>sGe}hC"YIfiW :g5XzKM,k{Uj }[Lcb!YXm) a18mH\E܄ b84ͮ]'Y>t CFəT9CAxr IBqR7Ϝ.Vv=I2ߙv\ೕo|=AƘADYJ l)iC1L,%"r><NJu n590:$󽲗27\&F4!_%dYDClS}- m^-\Y* j+h7Y$рdFAS p_.rl^܊^`AzԎ wx?/Ql o'Z WZ-a+7=mi4mXӦgOY$,Z]̏cvhעUSo/ʡhAZFkhJ{Ju"JeJ[P*ˏi|V(^dH)}ƺu9`z}GiUgY=ɲ]Oprfk}։YMą *;kA/W;*s99el@[ &֬n*'IGUN4Ѫ FϬJZd,`#!Fʊ ]D QN2j a )K>&e4VЄ8J$FC lĻ\M_n;Tx,ˆo/؆o.;t_ ].Ǭ)ׯn 8IH/6"VTӣA%nyV'm`ʔ9:%ìu% US1qcPAKGA]uQ͜(M}g_6s漢,OY3Et.^t|542PC3Bn$ONeg}g%Fs9ʉTt?.AUN(Y+WY4L0 *rۄIXiSZ4 Or;䣧'ZfΨY1ذpVEq.Ό`칽gJg 95d,[{6pEfWJ=\B0gWd0lઘ+չUvMRpU\잏޾eZqzv4Qf=1\=ZhiiVl'>lW>z.W\:*"?"k]bR=\BRK#`PîZ \k-:\׭ zzpȻ׳ay}0u_V󼁾ŷtӣG ƣTQgX G[s8]?(c~L?]^]ѿer[]8wEL{#[D^OXX{#{x}݊#?qWtaZ^69rS3> *(|c\VqPp}Uio@`ۦ47 w_Yz3h&)Z{. 
Y˟{X)mѼŒ86 6/Zu\kpU++ЬD][oc9r+<"}ڼb l0 C F,)lwwŒۺXl}3CTZ@m~uXjq`>`F; @ *7AJa%3![K *i+K @\(S(e-_L֣;I[aYQGQyh4^-\?r괅/]+4P`z =.!sZeO/s3Gϐ%AU$mSiLFA4/\T=dr*Si,QF&I]zO(w5O BlY22ǘe RePAEãR Rl3uQR zZ^hc$ S̉aN!'@^qT` ɵ< Ԫ=BgH̴uJ;'LI2dzX`m͔[ rMq-ڲA&Aׂ> |CA@ehL 3Vh9 2ʵ8.)X?pp.ۜ a]|6&8 _ʾP?K3kP/6=t+aڠATsoSWӪS׵s#r|>Fcꓒ??~Co2͛FR0z;qI'>J)HFRFZLV` g)2E乕.XLOWLXmÃw̷זqf<&M%判lP0ْPZ^DwKZe =rRyo9@"\]U^/tex^m8mPgHHj3f-hyT&G B1iw>^NsinCfӟO;=mP:JȲ/\AV;%o5TU=+c(%B[IG(' LX&5.+S2O[YYϪ @=k98NZA]T\2KBG7E24"}Ud`Z8`Y:D,H+JP5Ug>Nsɏ#,f&f^:g5Xإ&h5ԪlIa+6Ӟdb 'CV[m 3Z A2NWE1t"7!l3U 4n4mcvc&[{9Xˮ xz{r)ډx3pb~uMXQr&fbJ6$5^1!n9]购މfjz.WMDU>[Y`2 R'BUJdsFONB% YdBdАY"!ԣz*ՎEfZ6ĨI6YŇ}~\-('ˀ.uM43dZGT\|v`!tgm薔]5gWa.S?粔 ~޾%-;Z,gZ_ט Pe$J |Dlb,}2 ୍a1a$GJnu۠<{VBWi|M bk}ꕝnԻJތڕL-$m~s݇;G?6 Gw-ߔG=B#Y?6_eˣ%FOEԍ:b+Z]kmv LWT%%᨟G=*Ho]XU^P^[:t &o toz%%> W%,{zWȧ\DN+繌 Cp DК:Cc99SM*3Y'.%H#eJ.tBhխ'@xl )K>&e4Vdql ɕPIL/ζpd9xdE t2 s_' ~v;ن.$<q7y5ƃ}okQs‘j# Θ]nD=*KZa0mu  LΘ- Z]0Y:=9z7FtT9QՆ_Y?-aٛկcc>Ŝtc_yY>}N%h@6ȌB96qBb/9e=vc=(y4AOĤKu :rBZ̢aiPZi,VŞYfB$c\;M) O{MAׂ**x3p훩cR{e2J]ewXMk.~xQ!.g52q*!6L5>k|!OAsf^w5gϬGHދw@4YB8s##lИƔJS 2,앇s@QEϨ3S|E-o'$~w6wίT{9s!iJcC+, v6  0!JހVCݛlNd4 (0hbŴjCg27--&M18+n"dA|LJӵi ^u96}a)>Rk`Jxl;n߆-qvZDݿm2 %\ Yf͖uzY 71IEK/FcPJR# HK<7+H1HpD )| s:[yʷSaeZբK$nQ@I ogxK&w_\ߝf>bvz4t^]buJJq01mCwNJ I[jm,} +5l\Elju;+wNVxg鎫eJOMg:.R=fަxLe;fJRxu!,W{ o;5Nt3] 42E) Q|Qe*uOg&KTt.&8m(}qvaœe.TZ۬ j}+CsbsEwPԴ{p:tpRlyW֓Ȭ7hEԡad*U4:R5ZS(mAץ9y2yTalYh=ԭFhWMgo:[P" B_'0JT_|я4>76z-Eٲɶ5>b aG)kc Q;m 8# Q¦33Ԗ$I s9[.9 ^X%CB HA$-x[GS 3ܥąۃz)U1 e mn>sB:s`LʚQ&뻪:t< m:O*ʬh\́ Jb,\\ Aj @"^n&t )A*Or)|y fP"B-4҈Hh;=U["L4b*!!6ΒmE9d9g&9ŵv ʾ$f!ffT'"oIKmM('ԅ V`GN> M"좍gD$3n";EHGydh bd=K H2 L@>.}mˬ:{Boe>PBbou, ?cpZN҇4 r3_!I aZw;6YsŦK]nO(jWZ0wsPsFcȿaX]w1>=᱉iPjb#?S2EϽtL%Z6k{|M:ˆϽa9H,g/iݵ\)I"S^yFL.?e1::k!wmH.#nu{[K./{J֙&$~!)QHJHZe3݈-#]}k~uun~tǣE9n?&9.~yXZ% {[g(V7-)Xvꢛ @ yz)N0P⩟0|:_$4]j=2r(D> (ͷ+SWܑו*ZQ"OR~}5?L}b#W5<|ۉWmj.%^UߴRN}N2kX \m+нYs-Qwc˩7w/J_tL1L=>kR&65mή7?˟x,lMiU/?@ܞZ/=1jơȎIf*̑Dp lȚp'G%G=,4wܫ؆timXwPC{y]ݱ_t%qb ' %51(g~|}{ B!4Vh &( B-;JC!4. 
A e`?J"_ 9Q<%`  FH& 2&Φډ?8q +p>=O9Pg;vYoc6fYki(9I-6a"k|xWO_ZM%y;`^,Fr[<StמMYi,\&o:ptdɀcM"I2lG!,w{ުXd|_!ٱ9*ɓ TB .aboϖœd ФFtۖjo2^=ͣ`<-S!&,?h<-:esB o }ݯ2 rl:y76ٲf[e6׸}Vݝ}zϕBm}NzNY?O+u"fܸ+X_Mnw~zD8;z?Ƨ%6Rxy.Gu=1&8*ŨtJXjBKѸoMﴪW98笩H+lrNA} A3/;mԝխG+q .IL(9 .R(,r`MR[[R22jEtFUHR(54Eq1S'e1HY`R]DY[Ylմ|~V/?%3=9l.s%vUqsv'd\ dk{4-fk޲RaST޿'3zk'fYZhmDIALu/0U E%}TFdjkؾJ.@2h% 2l)MN0Huwg4P,,|Wt vpy\ɾ74~<6/qĦhe-5EQPd|F BBLݴN[YU>IBJVGEAc21gbwOXW$Nv؝t6w}AmP{`4A'Jd'dHlNh]*IZ"P='1$(4v1Xd! {QeXZfɦ1c³Dd:;ُSQDL?ED1"Dܥyb>>2͉Y&>C]lm ?1*pYJe8Rb % KR:FĹCC2$}ZLKŎOwpqŝ|4ގ^ϊ'8 $dA$JR*\LjN<:cb_3 x lOme~vSo82DCTL$>ٿ*k_A5&H|=:TG ®ѯ_8i:Y:?pWD]Og$ExtŽ|凉K_0i#K<{N\|)Rd)s'Rv¿!Um'@VapATpo!f[oFnw9\vU|kg!x.javu>qQW-M-?_]:Z0ga6/0ǿ}X^&94Jv5񞝎XGfW#˳ъL?z>k::.SiXJdFW60 $4e/Fft1GFbv'lǺNxgͪ-7ՠL6%;hALㄊ 1{I(0YmϤy&wIizg:\lOzc=A)l*BH%*٢" $żE 4Q<2eF1UqTAd6SR ef$%Gu&AJ^N\Ρu4VE(HD.! O }KVA eZ"t[BAK鏥`B] TTL 6:ms QZE:r3$+>- 5]l'2X?!+)'s^B5՟RʃUf`c}1"!yaPH<)őtuiFB$A9XllcMWi[{$^uUŠN=o9hMx ÄxgF>Zuvq'g'd#FZ"4و?9P5*Ia+ŇUŞSU;=  ֚[ @Lh`E$#`as%FBã*og.=5|'x`&7@1b7~|>uA/CrJ7Xj,x2^r=m=^ɔ@ ^t L`3w.ɲX Cg붊ҳ"9R!8@ &g S (h^v&Wby)|rpm=Mf7]k!w6K홄W";;?e1O,m.}zw{wܝF/l;s]3ߵ|}Wwhy0Nw{I) W4|~?4G\wRo ۛ{<>?Ks|g͵$:ٚS5Q֢1h$d]H((@ANbZt| 波z,/"k7X^TTJ[KJBz4E)fb1DmAQmV|08!m7zm}יXg]1G MzL痋l1 I^Y+\uyA)~[Z -lH=(SʉCrf58,x3hKlE 5jJEH",:*C `E`! J'm6 YB-6gc'y:ĹvOf&3VQov)*_Kȸ""+!`EQZXD"&*j`]oEW@}~fLZ!P%-BJi;%1fjVH6#"̧ضb`{In"w|,HVVFl'bŲG42e3m`ƲfW=|XV5N8?]$92r[CJq6z6e(;^_=6pc0,ڽL"+$}.b2xtogL Sv3 *T+=f[4Q3682" ҄i[=z,OEȾ++"?.inJ$+{E@"mN)a;&DY%"YN=ǒ-ȎA;IȞ!Y]_;s^OĚ`s5i&o?nK$J" 6y`0CÂ7ʫtE+>+W魖ǞʫҲ.w8#9Xi!Fߥ L9WЅk*jAGjbl@(ɠV&9'F/TgVe:<\9߲Q?][Y3Y!V&63ŅZ[?w갼ƃ*ېVEC8Rb*#g*AR s !#"D7ZlM#KL[5>~LFh珈bXЍz8M]!귄w=<9:߆*>5A-woiҦ,ZFIb"vPwĽݸ+q^FgwJ; ,@nT8m>5#hJ kQȍ`އ I(e;m𨵂<ҪձemeYO]G`0~cku0-˹Wz^tAZ2ߔ~_0N#F KtKEKD2Ǯ[]R.)ӑ&e2D2E5 rd@Ysdt\t (bhŪ!&B(%]",Q:+̒Ǘ CvLb.8!&պcRq >Qs2~}z5;cPB/֍, p=YI)K>M-b=H,qnP(} fr[f":mVrϓL1 <י@1(Upu .. 
1pk@J$V;5x-eʛެ~u)J ަlWt+Y񕠨~E]TdCOJtх&N\tsqW*'o= kb{9gT1 ygNBd\&sl)avA+Ywܝ{=.|Dm'>.3py6}-6˰ݺ;@mn̯T{رWbGp%=)JcC/+,;ۆFHd&Һhѡ#Cߩ]]>'!BM M*G#Q+3$ɉ֠2[7tLqC @#dW[i8JC;QRǗ1 % &$WE>5Lv hM_?7~:W/rAy,M`@%\@r˄,xBK %H$%Q$HQ:D8"Y>0|?Vz#&p>%19@iFz0`#kjAzN$en͡+'<`S ]6r9(RBպ[-wAs`Jy7"birWkZlZG(O<(%#U@S 8 LzbEAy O! ܟ?pAqfE*dPK<'g%k")b(ςNig^ҷZl]G PжN})m. J|7ZxV"S.E%lL,q礜VĤ8rY8EL&v0^ -۩+q<OUIz~?10Ƌwa>AFm)u:~҅<_th]Wb_0@;kP`|Ũ 즼+LLU k/.ղS Б҉-??}zee+EIP:5υ䙌 rhW.BHxb89M-lVUXkjI|dfe*(R R` R|2hƫnQ L9Zdc&5 <']FEbxXw?ʦ/wt^iV ~5fۍ g&`I^ke4Jaj׹q*I)d6nwS۽Z|RM_=&"DpE="Deb bKI < 8b)7xr9;.R' ~C@ͥ/O:3Ŝv10euF$ KJdH2״C1Z4R$}ּ| "*X!tNq5DkK}8mb8./a|wTU9[7j][eCu{zEdrڮ|mI.ȵWj B7O[=tob2\(~M(F$O>ܤSgrU&E}{Q9?wACGͽmZ2>_4#*y="nwN4M !%LyFlCLjmI+Ae7Dq{i hAlV{>#G%Zh2Nad(bPĂI؈F ^gLs4Wq4U"j2g܄7!xS+ tP.k .b zK0 [fK[Aֻ4CyxS:;w&/Y@i/י?h|lv ]'_p|bҰHb^^Lf$I=+wEL,I_l:=}&ӡ/ʔG$Z$c4tal涳鈎lIGǖskc sϳڳ߷Swov_ڠ /jq7sddz?D-WȌZH{wd-z5.{ _cWOuЧsocP(ZI#⣁~062`"7n;I/oQv,ɑ)[ {uE:ɗ3OZ͈|JkC2&9w}+/Nj/6,1?T籱}H'R},"S;9[|??hqC3oeWV6{? Q="Qui& yB|U;P<)&qH9@~6⍹엝6QtR_-fҜbRU@.T+B0֔mrYUd؃2X ¼2\H6V՜a{ӾťyYví|RS~yY|`7qbGrgMWvtkdZ]K YnL.0 2Frz,ݞJwj`V_ ]( tEN@.&ܕL[<`V*->Z0XL)X֡(UHؙwk#w(q}gY۸xm5Ԯf͏PC-V .)0XZ+K5:%rA8a AU70Dia.T-Gڶq)ZWU-;ݪm%KE0{6ɷH '۰ذC~UaچiS#j{R-&_5%hJ*AŊ䔓U"Ma>s9Pl<8mLV|*P$kK3GhB@.z߿W5 +8}axQJ5V - -ٽg a(9V͈~QJ:FwEޥbqhdhWhwvRo:Kȁi~?wHAd`3WfY3)]3|ہX-: S(}[#Rcy<&'ȴ]RGvU/3m&]e8CٿMo1s܄yhKVt>p崋k=pt>~[ì`u⁛\ mAvws7swuV7_vxm<}Ak±%%HkD5.ϗco{Cde,4qj3|!,/ٰ.@%wVOG鿿uN{(b *tϐw y9y1qfHʔY Ek@\OGܭoxm;;^wvuGUG3˼^Զw}gz\/gózxVa :Fs֮JIa|%q1_[iY,Xw97OgLiO [y abG=ST]N[ub *#Vh{xŻZLM,// 3ҫQsu=/G?o<* -(p"즩-\x¿} 1/3 {j$9VG>SFc9Cu!R!f.VgN *PБhK%'_SmHΓq45ň[!EP}'鏫/UaK4:f T$}B-W$Lvɩ-ٚ}@eP>B;*5Xup2Yg Ƣ #,0*΁ڭ96BKyTQӬ-}|,kך}^lPҿ$KZK]\i5_w ˩_-6]QywMFbH^812 cp1gQ5_3jr-ɥl}) ^6-VFs}LI\B<MZ#* 5g72*ݰ[ 3ƒ(_]7f gyp1 {<_7 S:?l~#6 sUeJИqN "svL]P7uWĶVВTDƚQ9(DL۩()>tFnٍtyQnGgW8Ɓڃn:LJ212TjMD Uj#0E؀hSp< 5|Pd 5H-{*;FBQ#m*Is,I5:aER.mx,ح싈3"ZE omϐ5 $G*YМJiL .%֔VʁDk!CDF:Vax0 AN{+$%K@6yiI+p댈ݚ}X5 ..P4٭싋w8pqє=H8 A*ZDB,iJ)ɯǂzDZjֆK}(_%2E#F 
U?t;g_9a⧨lЄlI7ú#r5d2v/ņ:EU9Vd{C)%1IKv˪ƒ@\c+sBCPNR0q13֜G,d?]G޺(Q3IUYSjcY%eS"9F]GLZQ_š75r<"WB֖ @ 9 iD) b|*HߴpP64s*/)⢂ T*E;$yil1钄ڶ S pdԈY<8&y}]1T6mT,Tb.k6ґYh^ &Gebj`9bgݚ8[fi[:>Vcs"HY0ڙ`,*Jmm@{+~NpPY9,j5!/ʆRji>Y 5NAQ|oȓ6;nXeL̖UPPY\MtlW)PKBP#8|uX?=?Lg}7ܦcR2VJP `(re%T jnu)h7#"m Ax.o|&..`K`K#Ik>-@qGŠ&/$1£_xܐ!CFzolz֜7h)i׿{lA$Zɺ aPIJ +gɛsy9.S SLPseP%x鵦VDCqzKqꬓ@p9fֲ|'m,$Θ ,x F8P!R>z1l31!Z?G+)ۃσ4qPe?8sX(9Unq'$í'O> ;$O"1J\1mV$#POB>6hc#vSN Rr=r @mbAƩD1y6ցHOza߱Ĥ-^ߎ73l͚-X5C=w~ 7wX2G-T4oJ`SUgi!Dɸ‰42ĝiwŲ7 &c7 62nu4[mMr%NтR /EJH58-xex<NӺSk]hsпyEYn,2K*i? I]2I*&M.sFJK.#E+>{#%%s"2qN2Iasb P"&[=_osdC1\E)'e~m'ч~Zգ|!BLN;;H|j#t0M) 䉱R|v sIYod^e^ixMcȤ;K\uڴ!@P=wLAqIipVKnQD5N18o3yn0ye `H2F8 ^ˉ_B`9o}kl.K[8~hN's.V~0s;ƫ2ݺ337np4\y/w6$WՖ#]^?)}1uezs8ɖNmW8UزĖL[wiz?gkr> _nMt~8Hv-̮OxփQo؝PM_-)y}q?롚AK[(77`:]S"wWxnh9̎Pǵ vqNJ OVw&+t2ytQJ++!b!e;#؋7=eS:ĬpZ łI{3 e:^Є.Z8Γ”Ga}cDs^~]'ہtm6{5ץY?j-ž~wp%d&z!"GB"5!,x*'t™2wSޏ^}TVP֫s5mRb55e4XsLF*֕њ.PFQ2g40pN;DWXNt2`c:CWWCgxfOw(%ҕM]!`ʺ fRv2Zs d_r6==%h8|2`ELWr3vC @WNJ('w2`C:CWHg*eptQ^#]QkEv2UZ0NW%=]Bb @Ju.#h>v(6JtHD?Fx5js01O_OUejTZyn|)"WJX%Q_{TtO.鹹z>h'\LU_}1+JXJ[+QnHi| Uоq/}ġ|Rny6ԥ6ٲYn|]Bn;NՔ?@맪:VXʎo%EGa ^nG}m\sܼIxMQx{ףMvK4Vvedpew@ޅb8mNnbRRdib*hΗN ,T֎H3iOϊGx7Ofa F15Х- ^Ck$ ӜDhGvs\.0o:X=+Ֆvi}kޗA5 ƫR:Q>ϓ)}nEUV8U<-s65_]LrTl2@WA`ؘR\S / -@N8UXr y%߰5< Of~dS+4+^Erkھ]. IْpJE}KOᯃUFt6MS>7&30F)}|1[q.UaS㋛8.rb۠+cGIX\ôdAUu}I{TV<&ˏ6E4Hh3KdZ,I_- l%Inn&ho8FUfѨYfRysV{sܭ'F-)gVSҚQ/F?v^[[㙊u>SiTOLeܗFpm<_&]'e|bJ5এW?-Z)m7@n(őU]f;걢G4Du2`;CWf]+`ytQRk^hηZbq.( vܦۼ)~?ΎH۔5S>kXx?8SY}}P>ơ](T_UYկ愒?策Ig]mM*AWzCo]ƻA hW}LcTEC%s .Ejh&r…U#/kNY3jֺvW 3^tXܼ}ofp3-R& .Uq.5/>#^u%}TTL-hE^@yb{GL;B=t#&_m^Yt|$!޼QyOI=9T3g\'-cTAj@HgdEpL"ImAz%Usor[t5P?LoO1ۼm쯲޾1?r+i 1d0<`+ J3ɝt,0ph!钸u2YAGFq p zƈ IB:& H?g(;&e8Hdd?N5@ĤH@ eN q&@9<c'ElƈJ5N1 aJR5rTES1ȹCu>8ʤJM""da$%FD6ADb6 BŘ(ڌ1f^Ѫ!%7IY$Y OP蘲zhM(#Q :AHҾq:>2J[ #M^B6!KgqedD)ϕR漹dHtK1)!yDt%B@iČ&IJ$dYJk%mCo@$]kd4;ldhڢ[GO[kJE$O:FQ&X"GVC$5%q%G 1>A`(C(A@0$`ў@k/z=Ӝe5xCNI"2e eZB`<2dLEDڰ,'ՙ  jynR!w m 9PbKW1QC+ցՂKÑ)˕CB0#P)PLĈ"JIکƘɄ%$#\$ZU( 0C]ɐodp, 1c,6V"HցECBV G\8! 
=泺R>]ٜՕ-$pvv<ɭ>\wŻoUi}[ |'/}vp /@=e^ޥToR?8zuਭUݏGY=:]p@b#AZ w4.YJE^]5חKvOU"3cO*Y+$[/TbNĢ>pB%$J,ZUNQYf*6K k\T^G'*tQmJ-oq}>nDq`nv^_&QEh b4{KiQ!u*-l58dRA0 6C LAr %f s0 =+>)_2J2jd*Itu9%x+ x:ˮhw+_vyS}Dv5,eFb޽FEi|4IjpS]Xʠ7 A}A&[['j=jIJ]$( im=Y'16h5X T9J L o)vjb _Xlw]YS^6lz }C75/UXOss>w/#ZD34` (Z7:fVS=e/O^2B0dr2E1",p&ܹg!"W.a: ;.Ӳ5Eyᣗ!g u1 bK&0X" &Zcx9GnN,Er4޶;rF~Pu<[޿Dʭ;J6j xz x2oܙ 7Wi`Y=xMTk&Xk&R1Zuí͢y:G.Ow6e-%CwwI}D-C_}û1Q9.:^6$_xޓfWMM+-iyM<~Nio [omJ9N`5&{/'FTj9#JĈo|q--]]7::Om|07cJƓl$:ETpO8O")UHB%jo#ڔLJIxr*hp& Ș\J.lElLxeTJ?'t!AD>jmLp# nLuEΑ>6=!`XxRSXAh HI:a󠜳Y$Zc ݖg2Ȳ"٨ !3r>`^k/RmyQ=Gc錜 H(rJb+eE5sEBeXL0,49 ⳰H-9zay a=z |:++ iv,!1)Y霊PѸT, Hhgut߰ ޴[t~*%F;cS~g>>LF}/4GH5D:5:&:4oŘ&fM'yT}sAmJ|^=ܔ]M YX\Y"PW$LRb5ƣ?/V9 j*9uasYlplp6ckkY|yH@$StU:$9X^zűGGoHoF1Sߖw4r*:3cHaK<1с5؃o4N.B?J~/[60t[g T8RdL BbhXLE6}qevQ:D4l X61?Nvm٫l l.[~>FWzbW Zo٫ Z)fZ6;CWσU{8vSí5OgTHʎr}{%A(P-JE7=PTBbGY}Y Dȼܰ 00T(:'o \ÍDEi"A =F<:,7&hnD҇ T[Ӑ 5A)3ʝQ1xΒ+tF.4w$'L)@Mޟno7;=ZxY夰ϟ#R8ŝh$S$$ÂO$1 $'4\#XxfizڐGRp 'Z~4 sU_9AQR8(T,< OV׭d?Ѣe[qqR N@?iU_>ӣfМU*P%qtl5zU!bsxbَoOrY~q -lMd/oGlssDl~YYϫ*92W(y 0(z߷ ]̈P=?P=23GxҸ6rkܚ 4/9nsp:j;mD%vQw qېZ:jҰGLma5l[ͷ9LsuoW5BʞСBTR*Y͝2*PWDWq?EFa ?9?⿥#Vou}i(/b4ǿ,P;3 y=Lgw\EtOb{eLRSt򂶪Eu`xNNx٤!뉽ܲjGNk~& {Ӽʙ'/<r8׎kaG@6i .0OT,bx\h<(ɩftL_,jH}}QCmu %T&)mX $-uFU<^e$%`,p !bIˋTO7vr=ᩋ)O>ס^{:*7 -Ґ-2@)%깲by>X*-\Ϧ jP\B lQ5qkSfȠG jVz7;}Fq%;$c f<FQ.' lK7#7OwRⷹXt ?0LN)-"ŤbURR"A*8S{G0]Zbchu_0"{]v}!ەsf? P gGmtJ%9+E2TZB["5֣lhKtNj]Zn-!†LNgD(șY ImO)7" $hjI&MC]th*bQ)LD/C5 4 *$ Tz`"K<`B8H"TAVv \q!!A@v$)){u!, Puw0CWmgA89 -Y@}(~^d7x>XI*iv\M7ra%jrVҦj]~P ?qgcVf$o ; ŧxYu^o/7D_|#;`Rj`3W3Y"\%֫ӪòBU$4Y_}(.٭}/'qY̾Tmc]!5޶O8׼,Q[&3o'9Bcx};PEc;)ш~儫iGs6zB28nH-j4Ӎ0J('z)GvgnryX"wv'l~kIGml$F:P(&IK|ca%tY1֓"[qfZ(3y|:ǎF1rȜI {ѫӵR T#Cnϔ/}R˳Xf箯t5C792,/bzd~2͎\5W@-8[&WkqߨJ\g>Ţ2WoG7m6\q\ey F36 aS$c 5*=*h s[@+ŦYAk@y>dҒFU%ãa|<28s(!Ow0vhU~g<ѷ\؞n:^=t/pb ^& 'u,ASc~-ˬ+֛G?W{:#3|woP|ʻ7ɟNj"f' ;}3^|!UAi2yBU#Q)bQ8RVUq#VrV-IfT&yUC)"/. 
]ڜ~n/yZޮtxl9VРsVuj.G [tuDafy1.jEwVw~QoiaFVȍ%ZckRψY`O=h%Z@o!w~[~ *&~垼j0xvpǼ;FslWM\UV` SOCR,-#́gĀ9/SQS {boڻvݝOV#iyRa%&2 54rghP'JKsteڲŔi+ΣWH?vc3wk/=о&=C遬N^eձ|8Mޚmլ׼+꽌')v]LqV-*\Lo@shVRKg ZxΙD4vIXn6aLN"(;E5RFqJyx¤q QbbQ4I ֭Phc&ijb NPBY"!(*&g TɎ=e0⽐ݧ*f n=hr:j#"P0j<$"]tȫIS*Syi%FHnEAa9J){)D9:Jh \{ O.87`jb-JѽNQMN͍%00hi4,DJERpEț{1UVZZtHPs${QB!Dq,tJª$R &3rFtΰ3 c,ۺ "k-; prᗓE˯O(d22/_9b(J $AXQ{h޸|  \ԨIE;Elyg#9YMI33mGIL4e1tLj9#M579C]qG_P48zAKLp a.d1$e L@Vp%q r0.AN0T; T!hE{bMK"*1B ꨬ;#ay |)hE5 y[,> 6 +|k~wo˗N{_ 4VҒr)chSf䲃Y?3}T͙:ϲ~rŨ1WH0; u?{8~;TrJzp%\dB51W\&2@WJzpuHL9={;JwT{? rX8LZ{R9R22d" *G\F5dVZHzH6p&䪃 (j;Jg*޹-u+gUTrtмڕ*_"GC(N*I[i4ˮOCk ϐIZyj`ez; Pg+S ֗ W*wjY=T W C//DN_ .?1e9-M'z""'\Ɇ3&(yprς+M&Wr/cm:#\1޶㮅vyNޣ} ]~o {N>l;QEok)1yWSrwqߞa^_7xK'?+zŸ-zOǎ]E^ J9dO^_M\}NrHq ٯ ?}R eg~mɣUM|z!/ii*+\گ~3nhΰaD`>er}W6 TzKWޚ']ypjWR6\#%14Tpipr͂`٬no'$=Y k/\yHtD|pm42S Ln4`ZզvLJ gtNf"\Aypr9̂+Uqm:G\#oDp'.YjyJU{ =Of`pZ\-OjOtaJY2k7\=e-DRO++n\Am|\JWN+L+ȕo[; *}W^qR_jOm_v8IO2KT=y:  QO+,Re3pM+vwjݕ q8}U`1~6Bq{?Pkq*v1,q%i0ɕm/;^Xxo(CK;zV&´ 2 ULSCqV¶d,H y ߢ>9VO%rkϟP)l '֧>G?-^9>TƽWްOq"\Ap+M,\ZkTFWDb? E?|}z fC/!_]C.WFwɘWhuڽrHB`_i/!Խ[y7z=9/odJMtz?&+̔~qۚ7Ǒ|wLVxnPvDž(j*1Op <-jO]}f' _3k4toOUVk(~|[~Cl߿·"=;N#L|݆D-{\2BGHنj31~RwÝ$1!>]Eu=^*;4zw?3PЄ{%B7Fɵ(p l< %IZJ.lS'ӤD. c. 
M >/0v6lޚwBFjk0 acJ|vybc{hsft;%Yc.TFJ -֡M4ͣd,X\8]Hxi ׷_P,}mQkMl=8)#s7f`N"[D,=$ɌahAPnVrjzQK&~h`fao&vɶE4IqlDG; #Dk㲳 c0$ĐRY)1RUF|5~gR)+Q ϐ@`ċ~"_rnI1j]l ()%Ja>+Cshz\5H%5ۆ6$ 7Ė쌭)gHx1*nZNל28:Ck0)c@F5Z Z9FkBRJHڙF/" B{hyօ"L@X',5/Vs} Y`GT}F_ʙ@K "r;(ڳBAnQ:K+v o :#eOhb:$6y 1r,-xEI<1x-Aa[.P?8w,Q rқCPBӮě%@m!ty![8$rH,&e=* qk{&+#`[d7ReB0H\bHJǍـl(R0\j d{{C(C]1WH b-PNȆeBGkiB+1VQJ껱*pLj`FBUgwXK 2q=uipB]Bbb=9ýJ0p(SP' =Xp@ ".ڗ`3puk).xb΂# ݄ctK1;ZE*)>;3DsXǟ p`d#\+tRE,8J8I9X]`ڳl"-Rl eZ\t}#S]TIhQ]LI1>Qed0+=1uDr"Y[ZDA Eb3l, XhcDU+cF= &623S^z/oݩł4͈^1,HNϧc*Kt 3%aw&WЯwzYs]|wBӨ7uH0b6# G 3P.|'pMȦW%s"ajmLZmZU l`VPda>⣊E ]!0ʃJ$JEU28ǽ#`4a1F^(^R JW\eM6%[Hw 3x/EUda:\?3yM bΛ.ێjʨzeA4DaN`O7۫+|}y'k7TM#W`> _{`#Cˈ|wic.~t:EjCF]%(% g0B5ژ n b٣f _`nĜQ2vQ uPt  V(U_c+pIJ$g$+@Ak(Qa8J^l% č^cxX,C03-5dXT{PZjw7jޜXT.ՆT0٠|,-⬰)M9QWhU7`A+GYtgWOMD,Ѐʬf5vx6@)zL4F;ja !NGûQ"Y1[o^SDa@!#x`A3GφjM=g߅R𷳩#bjѭ:ךX7jFq-R 2j31]koIv+ `l1123_f=ԫ-le{Oۢ-J2 R?Ou:unun(a;?} 3w8tJHA^XiArHFsȇ醇ZVkHkQ.#` R DFi 'O rk@zoՐwdX#2O-`qE bʅsE+fk"; C^'yM\ة %1<*!d'ZCoOps5ҨF"BI^1_7/8~q뚄O-756+ !yzF%hm IXw=)j2\\hx1=?`6*]W2olː'wkæZ!k5 z .כ/ʖC ^m7Npizb-| !.Px7f tXz==,{3Πr?;ž2r3,O KΦXl8,L˅nl 8 xMpu:4sB wh ~/x_`^lښ WzwnR=v4p9 #1wpE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"Ճ WVEdX($1\Y WEc4\ApE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"c W؊z WWV+D1d:BÕ2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE5\?k2\ \vPJ\5+@&d:FÕy2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pE+2\ Wd"pEv0{'pE/ֺs1M·w*%O^zFzyM0lx=0X50Еfۃ? P Oc}(Ij6bK5{o "#?v1x9!n^)ܨ7A¡/Obj!p/gi5j<`|PuKWY0+kf1/zKk1.,ob[/i <D2HqSjNKq ~)hύ&@5$G^˫e3/uj)Fo3p|b҂@mp5ˢh Cv}.>$CUۓV5]ҷ'ϵ`ׇj/#?mk3n[do/_rs]G#^٬c[|[ZJYnM SE c'_Ŝ}}u='!H`nZХTJ75z>pNWn̎]]i!AS틮tϬ:/#JMcHWF alEte•ը+Db,LGG,w[%PnГ1x*=p0]< Z+_y7fCh9aU 8ALJ Ҙh+髡iu-4h{4(%>BvrWDWXW ]!\k+@k=tBT1ҕ7`m5tpe5OD:l^2{'vϭv3a7N>]#]i6=7 ;)+t-th;tBN]!] 
R>U`=»x:t^f Kv4|Ojfa{zU[/hxXu.cq:*&Pͻ޼{PŤ6`P2k>f/WZ!6w |)ƦE4JxUAlP#oQ#?J#!W-Ȿe |lThmmNB)uVQe1ZYEkUۨ|uНeJD[F|k(ANW \okh3D%RFs<U5_gB-P-'+e7"RV3i+yh>#?V4;•}DkXEtu==UyezsDM4+RWj's#i't5Vj Ѫ+Di nEWZ >҇4~wlz{ʄNzn7Np~f{.h=ϤvBAWvDWmz.!f7WCWWCWJItut%&Bj [ ]!Zt(%DWHWAj>'SFZs_]ک2Njeh ad sR;5XD|ڥexI-[v}@ds륜u#eeQWK7S|5,m;8iWW tۓ}D,u\ rPyki\*QJ&rT\K^m68}3ǂ>_,w}[W@P/o_&WeejϛqX}e:ó{=p% nׁ`Y7ף6nhX]Ae 44Y"CޗFL &%RbR 5f+JkM * ]\Ѭ߫p+2|1l/LK8]65)pÿMσЦLLR;.pmqJF瓊"\m,l`P`?5 9N,?Xlιn[\;]_;j4kaqҿϾ+是ë7f]7av* R ҝjٝX5r;]W o)w}Q:Hk/:h|k~A'iy[uJXk 1`,t(A: bDc6[烗1ck9o.glsls];H U,YLmE$DE@-LQCNSUH 3"seRLn0UMCax:{fu w zU96=9_SYR"h֊V˨mEi8g)h+TVh ɿȿa0k o0 &vn:+(,{)1JsBNϿ[16aY]-fy.! 皿{;hNASzۓC[fL)8ɤ~o1ipJ<ʝ.'Ib3e4nNWׂe}ei(|6B@qaO}o<ӆKΦXl8, FA{>jzЌs5[LRޔY: PîY$Z?ssPphWLAG0jsF-0Oξh1SY9uK\*F7@A&$>?O2˰\@34erv3hw'}8 s ]Ɣнs=fiA( Hm+U%ZVx޷/t'Zsi0Z{5$HM3_ cJo)z sY(u$E.Y E͗t@99n!:4~Um?ǟ-x 6-iRf̍]zr;;)aݭϝ݅R )-[lK. Uf?\!VmE_ GIkgףf#s?hh4P+ vk]i 1dܘu*ya?G| b}Z8D`v޻uC7{_۳eYnD Wv[k Sf3ey%(zVVVrs K%L cA,efs!Y g'1r2x J6^ ޮL!/g=~~pl)[ [CF~?\#N]K#vVȖ+)N%\ [S.rW,'WB(c""g \ %KX,+->j8G)>;K ܪV.3VNx䡥lK@FUI6(31 b3sv{i8Mz=%οUwXuG|fqoTjt>pLLpQ?{6rJҼzrcsle/&u?\\yH<ӤW @*@7_ Y`XN NYf,*24)W̤i#x-xfEzڐGs!h N2 tV).x"Dtx3yyzAB}?3@&8}afE*dPBȺ&a1V;1b(ۍWFYRݕA 6G-q>A::&Ct\ِ"-ynL,j|}QC=m qP`Y5)Go$: d4mH΃`#SX""1W5..zJog O}TOi+oG^xXuUe֯#}T̟4dKP4'+[u+V?(lU->>2dRBR|e35#HT0nȠ>D=c3i~plzb*h\(.1 R:%\s1 nM4lB$3Q3<0"hlEwwsDvNtX͍VLY6Kykrv%\0edf12$1҈(%K<ԥ-M4bqk3EQ[̊Fʥ.lIce’G",M Oߪso:}6v-۫2+wd(FarT pl]id 4=)j Eۭ£cqi8eˡfe)_yKbwd'!hYLr zZ{Te]Yy"EhvlM KBs'1:! 
cC"KkL$"q88(RWJ^M$X*BcXh#t a!bbAf 6CWgIvd>Soouq㌧uw'qraTfbꕎ.4?Ct9//57z`NC xmO^7/ 7L1!/9嫏8ƴfa)o/g׫I{&pxTjNkF407*\'2jZܚrc.3Α{zJS9i =?kd8<ϴR4?]>\D OFx9߳3q>ŸXPɜM/Ыd})~N%c8ݘK&N.9 eл(rK~Fi?-;ǟJZg27O"G?68qZJD b#ř&:FlCL6&I.E"QYb\G"d -tDVcn夣cuḎnljYGV&W::lwWi_2zgCk_-7ŖigFVHGdz9} \__W$gk}g 9iaC(fPy,ӶRdTG9~BiH:@lY^EHHQ% 8Q`}K׫_MMg[Le<:%q iwt`u^`ݥפg(=P픮':vLz o-ҷ6얢ᵏ6a Q=cjBOU>-4E"PWM] ؁ҕ*UJa\YN2@S;ڄepGWvQ)veJkdҸ`L&BbT..$4Ig#PQCh=,,r2CpRr&s"6A=Mg`!PM§ۨ@EL1 $9%!-Egd ̞xF'J6פ3SB/")س6튊MjN祣?D\&'_zb~$Ft/~}!\4q.a<rkfsϏ)EzE훜/5 A0X,qa9^kahUF[CC`Mږ䂠L.1ٜ74sglGJoX؛d싅3 m݅kqUq`ܞsV_q1O&q%\k,+&F5A)F1j Dݴ"2+z I0&]*jJF8Ld1PL`#f{viebMP_q, j !Sc"}SZB`@zY0P gHZ1X,db\%š,jT#4}iX$=aoَQ9+x,؛|싈gD"G% SdAќ@jJIP{C=}YW~w'O46e#Tv ]!!Y!-$ڢ(y\1c0e}Mw,:mov q*F&ެt|( ت.zp&^ER* *_2YC&{jget 1edˉYNLR8#"!.Jy Y>J牥d@vsYqUJA,6fhd `oN1Uܓ3u҃7^$훰+G82rx?ᒐwӀbii<D_]f2{;dZ{F_!e1]`M3agA=mml#ɉFdٱlɦ-`&)VSnrw+QLz2g~[U-ZU"R L rN>e u`&IRX\H̄rX6kvu[el7%;i>Rl&Y)UAFᄌ>pJFpKXn:)8K9/)wg, i=7&[}DZ8R.p.n! S7ٝ3mw.`꤆"׫;ҫ\*|B0ɋ4o-ЬJd4 (x}LҶɷӎR3#N Ji.Yȓú(\Hi2Z ASI%T$PzKegHRxpSHCۘA26_Y0 3 _4-)iO*$Iq ~ضa|f "xa,HbF O,-g]oYn f2 ӗgr!hqp: aɐ)%`f"3ΎHʰdvKSPȉik5aY0L]FMJ(L7lg*&3c;[ |@)=z*3`9A2w Ab ֛@&9Qo;TDgFIFs$G1%'*l9DxNW Ͳ"ZIpct'~kwxaN <CDNT̬J㶾'zk7P0$&<☦y[~46&5㡛cV% xi:FB!N 8Ih*)'6A%IǕ 5E]LR'CG&xcCg6E&R<,JP+%s6#Sq5̂@K z=08zhdG&K&G|un8LV xg;?`|`PzPǗ`׃p/nEW1KƓyroO0*F ~᧷C+Ӣ]8z%@*.*dp@RE+;fkpG"6Gu4<.TGT Ix>m:/z=d}z΄_Z NȞԁ0gvKӯ+}Qԕ|-~`QE]LFӚ|77-*2^eoN썇+̬KgI1W>]/ɶơ;0 ')|*ZABQ&秹8:2JI, 0'e~#6%*x<ܒ4߲Pe8;).F·Qz,QIOjZvѐͻa{39z8Xbf I1Vzcb]烮9iܴe\$W3I_j@~rF ʡP S%W<t8u}7>_Fk\Bmﮏ[+;o}q`1 Jn,J)_LjK;Ͱ+h`w*T`iV^F˚؏Rr})Uu1Xr L먗nH˧+oWzuʺ{ :=tLtNڑ҆Xָ:s?]bq=t6'=mbg,RxsX An o,(E0'-O[D(?hE~* ̓'V&YŲJa#0Fʒ2ýqx*f3)H疟g'' ?aX_壼.>z#i-/>MK-7RJ9/cO@zy^^T^B4x4H&MwWM[B 5aN90rJ|FniR'dfoM)zAyo2P~XjX^6gʼn_&_@5[6GCcϿhN%UE^TL&,U(=UgͫϪ:n^]WjsBOfz0![.t~,]֥^hzO2gӹmVBJѺywj*|mԼR [ܽ{͛(oj[=?_-H>)Z7Z}\4Qt]W ee?|ϱ;p0j`ԧiw)\Ɠ* TPZ+n+?:"qtTVaM8ϛvw4n*m"nYF4ߌ/NfqߟGtP]KaV|א'sc8NR|LJ`(A}@ -S3@T8FKCC;oG+~J1;(ޏѮ'upt%\U4nS%@*)Z?9GC_ Φ?jXEsnm7)[P6xz,rsr+b׻UCnv(EĚB^=멽t5 ]!\#BWVvBttĵ`OnDťNRTN}!R*` 
MO1Z>OӢM`‹Ke=J#9iI?aàrhYx? #1Ƶ.iɣ2JױZ2ÜL 'jN~D'?i,sF:Z&*r)BK`^'^ִ7'6kYALJι/'k] 1XKG)3h.D+e=Dhǣ ;DWX|/FuiNYOW{HWMI |/F3ߋ!t(i1w%++ ]!ZfNWR鞮4|kEVL9-ƒ[6՝i˻4hOӈRОvݡ+dgT%]=tebfte+ ]!\BK w_] Y r@3t9[5uCkn-QvdstLZc;DW)"xg JB=]!]D%]!`c:CWWB;rDyOWHWuՖjZMc>Mo9cTvhD;_'.'+''?p\ ;DW9r6EWWU;=]!]In6+l ]!\љ_'>ҕq[3tp9e]+DJ<$=]!]inT (?숰n{6xmhҝ5/:#ZINӈW{Iӆrt|\ s8NOR,dnVpevzpM8No||b/g}0bK+E4ªq<X!P12슆0TSGcs}ŭk# ᖂ}_goо^ث篣|jޙo[[>NKMwQs)V#*]YFW\0T]pmwW8 ` irFYOGlN?guu0V?`4ciB̕QSȄϺ櫚O}E<>L>Q]]Zܺ>,ܘ]o."-ᴷaA3۰s|?f=koɑe`܇ `$䲸\ֈ78 YEr9̑(j(x4S]]U]U]5|!>rNr+X3yM}Z5W~(MؐP+0n>ʧbI{6 ks9&!) sltNe V]YyisvLrnS7oʁ)҃p1D: "KMo8FҎY | \KLR5lѾ2:(ҺTJzMmlKՇ QQ\7n-}.'xT|p|6 :=x }M6a2r9l.0uj[3ѼhrOꢪ|z}^At:p1]wk KCE-H[0klҼSTii "CNX B8/*b)7{nF%:fy*+൞YNY1@Q"hV<v LdX$X{b=YR=ě0K7/qü TX|~krJY1gT6==M {4M'PwGJI,S˥ Y c y&R_'߭-r0D@`3WE81M#ƕpc{P:lNE1lg׳qGv_ijiS3`m~WvڲSDu/ӧެ ~jùj3Mbƕ7::σ686*+w'O>a-V}g%kOҀfb1+0gC-MN%txQ-&4عFnp‘n>`cAFV,܅Evc#a9"+7&THa<1cI_Q `KGl$,EMr,p!U|pAΕe`>Jr/A9Z1F.۠L gI1i?N7 ?5s';?Gxɾl>㐎uHj ,.:k=0턊L9+`[cR;ĬW逰  HԔa0+Cʘtzqgy8rS:!ݘt?#[l@EhϠQ[۴Φ/~?_G]w#4b& A(oX""}$ w@ y#(D8DH\q/z 8P~rC(H# ؈`KZ{EuKp3kSyKI -|g%0lX0t.4m/vKfZr%)@wjJpZ< &q{;F6;x`PLno6P|傇Y)-ZP +Q{5.cHGn\.&<|q{*zzI YPÕq"2-XIRm K#pKN$OR׍'،E.1xes}@ X) ' = A1`} %p#$)ꃜgYUWBqHcH=I)ڲidGRB<1ps"/*,z1}^ ȭ` A)Hr =p 8A!/V 3E j?γyČᷟ&*XuK3*Bkt}2oV-U^Ň]]^١r]xG(4tc߇ŴݖH@V& u (zU 0k󬺕WCA_̌r .ςVT<ܦXm n(فE+pm ;lsf l[MYqg\խ5>!$ϬhM2CoDָ|cFP=%Є6ϑ. 
?v}47e?b'ү-r'}KtVgc3 d `]dPt*O73+eaϰhf3AHr5OfHK)Fqp΢(10hs+$*BS$ q}1O_A R7/:׏/g/D*$R ,!P+0b9/$nH :7` 1!{{|ک+;a]_vՈavzj[*jʋ%,j+en髨} U䐓S囨flj !uJ'I2Jc潎F*k12UQ[IӾԸyȸeY %l:O5(qk-ʧ#)yu*XQ` EC*xḄ<[1tA`e۶kZ`yGDIEdyES9%1Tǭ“@Xm sy}_j7όh0=!p#UH`8%rA"JaD*,H$",={Fꖑ \ "y(Tؘjc1BX@1RK#Ձ\!p{< rH#2]2߈Er&E򹅷Y1/FYM{9G5XABl1JMOmJ@*?㫮濇 ӂײ׷*ܕi׳s b ̙K>=k`qj`S[ٴzi"lR Y _O]ƾ<[N[3^."ޖՁB ;Mm))54y4/?+x89<1:bQWp5 3?2lf?C|^Ip9B }I~"AdgXn`)"g3tvԝv|wV,`> c@0iMњ\:$U؍.w0#&IJɢ$ ,WDDD3Sner5mRp/E䝠>ØP +3%`y_TluN= 1 /umf@@@ ?/A/AH4N5rg6Rp\ V E vMʰRǿ|(+a Ye.gDGm߸a[00@X6i+@`&D9)uށ DP%zBVJ x>AL$@Ǒ XO5q4T~g4qj%=u^:0X׉%KTZΧןm!rFBB <] 澏KnIՈ8| f#:~x'Q!hY a/m#I(N.af u뗎-կZ65ȁJmJڈ*jXLϼ0o>#;Mj lrߴ0x,Uy~7dp5^7=ud~TdH[pyy=iub}XZAM7U(tjMϦ0t~,ɆAv,:zz\+-tV޻̛aey[S, v; ]Z76V6x$mVV;A)ji0 .~W& AxMjf5|X'E%|A_ U_u<" ne@t9_ ~3)u5n2}Vy9˰~Z}ؐ( l풆 b=E,:'%?yi[jm>Lo'-GL ]q3.nwiy)X{ \= KYR ZL\t:[Ɓ]E*m{eJTU^r+ŒK+(E ށ*ra+#IWOC `x<d!6D5Fݑ!ɢXHPUQU/_dȱ#sfFXFjwU?&,\)1Sbe`Lz6*CDs_ &U|vCr6X9')@%TEmL4e(XRF3q F CBv.^GmTF1y%}"@8t2&e+)قd&թ]`۶|fCHP8T1XQ;&FbcCm&~C}nƴ4jG0bWCl'/Nt1#/|}{N.u`LK?R_yV-۩ߞ)E!ӆiY90ƊHSK_kLڊ-1β}"QWRR1C'BN6'#R" qfXL3 a£rtZ\Vf% gy_ܟs6|_^Ѕ?&w`JNb ]4 AQ=s?ٸhcflHiUa* *Ϝ};|Q6%"u8Χ醋9V1GKe{bEBQ*%S*I:Dd& +$ &M񰠱!C !33^t2֤,QY)2q5M_TgBc*vS4.6E)QXmC2R ^$x@qp`P%!'%f*xq~6W^{y<'UB-e")%BgF+8;}KAFK@vB)^ h{KR@p -Dч貽0Ի%rkR"u;k-\RX\!A9 -!r <6R>خĽ })0 Sq"tVB-T&PIchcYuj Xt>|bиr9P( Vf o'= p6KbrNwuHuov{c&#ܠx(;k5)S;k0M9,\.dB${3 %E`" CN 趑׈RgYJA$/ Mb0*%IEYQXKOА >3m7Y3/7aO_\,~-tJ`G곋QYњSkK|1zۿb#y*I.* wrUh~(wT nS hS AC-d* `e'IDFbR묒?ڪ-d^ 'jRrp$$1(]Hf&p&@0)au(a+qS.36-Fcq?wz8{q,ǏyY.p-nM -+}y9%gLJWneQÛ$fW5*Ӿn!}%# Vu`u;% t&co8*Fߚ{.Y^EL RWPDSlFb^5 ̡n2 qFl4Zc2 Bef$/Wg(%b gdelLJ1}q^=g}#EDW,K1Js"EH.UZEW@}N?KHT[9'tQ^KpL& jaÆBJ}ajVH#jF❤Ozm}91,Ey㍭i})F-$%\b.wN38;m]qwB KLzIXV<ڙdž I3 yuh:V㽲EyXc8^F/Ku5^~XO$j: a 6y`d48^^XS~ē8zne|X"tYiE@yET#vK]Wg& n%>go9]OD> dє-(+t$$m*#[ѳyJ.7ћwO#z.JJU?¦{}/1*yx>K UP?2d^|ab‡#:]=J`/,?N\xO}+ {̟Y\Φ%/8OL>\B°w]e~ kFԋgj.˛{7FY|EݗU|MNkMo:R>ݾW6z^`)VG#=;0d;:ZL{)Sn+K9E$ɑԩvy;d/֑''A'1L XIxlF"#F.(BrR0 U&ֶjE:Eh(Pt^28.jdKDB(w]bL+(Ď/W 
z@Do/f+!f;ЋЭ>ԅ՜ԵK~=KYadI6QcU1X[d(}!rh"l4XeT^0d(@mr ;c,A!1|!vʕJa4:Q(Z5g˳;/ʶ͹k_6о˦ѳ|& 'ms2M˺5VRe{T̮ }F(]Pud9`ѣ8z8PQdEbvgY| ,i  1E$+06y)|[pͷwu/8'(ob>Ŕ}FON ߁IuєtuތrC}ۧ|r>Rn=fjCm&VF ~^Ha4s\cu蔍`Tw˦fl('9sNRSK,skFiI2 J t3q~YjПs9bjo3ոt`jnEwBLko')saD`s~1ᙼRyZ^糸k4=gYaŇYHwVD#c3ZtV0"T[r>Jcl'2CydX '*XsfȩJ&]}|qt˞<7fs 4^^_iX,#S.D 3/,ߤe,Wfk1&k)z!u8S}ARt矧 './tO9M*\RWrqk%ov۔:̟IN+;]̮L-II}׋mz|FG}7aKM"f?}޲mAaK[:OP!w6}2L"Tkuԏw:XW_׫ջ9ts];f? _J[vfʢ9`Wև}ݑ |3V{%3jC:}z\Cn?,_ݾ)WK+O!D~ZѸLJ{Z \U5T4X.% vbc3ڀ2d!GYt=V|=8)܉|+/<_­>Y}{,UvMWv"n_?M6'4 m! +ct*^(i๽PN:D>ZHyt|Wi˲qYvхy&^P*7/LŔҪR2= _brV߻k?i/(PUi{9N 47,R>};= uK]@>1/Voo ?+ZH~[d~5N'ժݍfzÚ<6~K6hǗ%?i6ok> JQ|WQE6Gep|5ouqС/ȣa͍8Ch_GZ]mo#7+B>/ŷ{ /$ %dp-ɖlܖvkA^&jTfbWգ-"7Hͥ<>Ea]u?wE}wWEK>EwUB2]1<wUU\UV]-ޢRhwN6F*:q.]-A +M:{FgtvUԵp.uo㋖B +[sx-JmrUM?},U'Jg8^nD%`{%393F-fP988q{))εQ‛o>o. I"u;3+% foMP֜&屋4mc4eYtVRQvMcI]^Mu 6.vZ\w$'~p_ TY?}f?= Wmy:jZymDҕ+OW, 1Ƥ䟥LEh.f-Z1'34)ovZ=e1Hf.xZ|YˍgelŊ< Q*@7'_ ls]?q\t# e:̤@%9*9pn蒕\KȆ-&!Je2d)q!J0@"o)XR%I(gyF:i>,?2tvLT,j!Lytڦ Da2pyOP§xlҏPں&eۮXy]B6I %X#2+g'zЀvU/ 6'S =0V2dː# ;% \Ck`"ڈzdcgiWi[{~TMh9 _8Hͺ}}([y1T]vjXrxXS(d8l}ri)?'Ϭ׵QQ`L){e92)AUţc$$C9vBlz 3qJG'^-ie[X]X<57 `oy-cQyQm:'e-?ZƢ-huΝ,uֶؓ6ʆ "C9co!|΂ݻԃyG{I: Z\bfA9̢1[Nn0 lx4/B>48ho2l>ܟׄͅ" [JHɬ@V}LӚԱlS ߍFߦ\{ACڦG~66}6)e8rCL:ѻ=zY_;@D8luCǥ d% ۣ;oX<.W7|I69. 1B'ʑsRa >;.,)1@`~ pLf4o+"K){]W%N!%rB7Zu^u= OVŌ+/H'!zǞF\fb"&]jd4묕/Q L-!3՞F9b˪6iyuy OZַ=1_ bju`/6x9>j{/Oԥ{xƛKOI,,r2`!8 d}SA t3uZf{NmiB< u83!ђ3,dc&z&+-O9xsxlόglKxg/F?}]&ky6ӦS\\,qAe EٜjJOpUglO??.<_A5*pUVKV9^3_ zO'iqx ~!0fa-3m$5\Â}ԩu! RCSɉ]/8үI'F;u0Wo\ :$%~ |\Ϊډ'I;B6:߂}Zjރ6w \-z: 37&0qZ|M=_.IrKݿBͺi}eJWů<\c޸$^޸wnVjͳ澄5EY2Dled4j=xBo?{Y 6c T3z(o ms5O]φ[7/n FάYoWqX,Ǽi[=iK]{9pJ'mb8\'mgsVqhZ;8yc 6 T,UXW\jGF 'oqf3\Zϯ+jtr־)+>o3}yjgKe4H[W7 m"K琯Wӕ6kuΆ A{nr @'BE^ʬtho!5.aOr˹+CCMr !+0V2ϨNQرiCfNVʾ)Zj6fZ^c(ąeN+`DZ٪pUg`\l`%:}K)ӟtyDsyB2kReo\YL Dti}o"zsL+&1)cN#8MPk#9)cpX|h7miԘB {7~yXPJ/3H3۬E:`IV с( >d]~,{vX`g@3D1~4EsdY323 C.#,ȱDݢCa1N 9q %0A8a1IMeelS2Vhv֙:-}23ϓOH'*] R32.' 
=HUe /zXZ #d,|0.M[YGO瀁GšDrt,J2#LMӐ1ӎ!5AP3W!NdOǫO<]q( r:fӦcq48E’&f KADZLO'W-(lMe7tuǔJ FmiDƄ1wRi1@ArA1[ R8X[w'"ۧE\be_%r񹎍o}Ն6!H R'BSJraRP˜eBd!D70 c7{}Vȉ!Y![<-݅a\GCzMEއϭDɗ&$ qr#sl" TqZ-AbCx " $UVx?|#K뵔X/ϗk J҃Jwtxa)ؔRR+9sA[M&_uLdTm@UQM7&46v*5–O\[) -9݃;32eQr=*:.A;mQE zרHT,]tK6jS]oJ}N;[dm,i=WqO翍kB2Fs)(Gz9RaОdE)& fc26b_rl eZ_ގcӢ&Jd8 AɤCmdExKJ׭cL(s`BqJAі  a"{>vҙ:-hyEsHe̼"wЈ;6kT&=oX|&~ռgvO\boղeW!nޛC7'^9 mq pHAxM&錩ȾNT6EeJ*(hɍ^NE o/j?&% pԕ̾-.ɚjP'.6OZKX|/)Pj,h,fn٩ma\S| o())%m;UOt,xT*,}JP&a39(Hrl QhcB &" fc>gH|m~p9=y 5BgdNoo^$]LM{ 7ܹY1b6>0L'i e…(T U6rp&\$"h^SIw@EʢSJx6TۍwFë5aw RctE5&u45b |óRp/龶:ȕxȅ^{u奬 XImѐe2"޹?!$Q *HAw jMYgdH=9vas<[SVL2 >z4EcMYɓm6jSm8wbG(Ǝ 6]k!aeb1R(˯4/9*RAYFRNQ0ҨR萢y3Y/4,ea$z1:eRad:$rηoo>kN3W_ }Go32;klD,ZD6aV% d]n&i oeu7Ѱ#} s`sޱr෣}Nܫ7<#ӼlZV ^m@9M_i ^6YBi  ) U4:/1tv?Ҁ9Oi.;fKȝkGl ;QcL1$L ;up [{T sҠv.i'Q۷"> ??~ηd4(dzQW49& >._yMH >L~r}[~96?;.v8Wo'_^n{`}Rݮkd9Wb;kRbϛ;wSSg;QH^j|Z*字gLg;w`C0PslDUXFKu2"WzˈGD? rI`Lں`mT6Jd+]HP6 _b5 T6gHP rnsC>'_*"%gdXa06F: #Bvs0ҽXL?FTF19%]""jRd kT4ވN S™ π€4FY_U1CIbFML0b1]Dm抭|U{6/"+HW7gdժf{~:hp6y'ߴgj[תc;ϟatLkKn49xF2Ԓ,CK[䍏ڊL-1dy~Dݠٖ'G6T9Q)mNFDAkdL;vb!v0bw'[\Wf# gaktfW?N/No10`J$) (S./l(Z +;ElM?jAiUa/ *ǜm;A(plsRLjݙ8O#v-/ݙvqD>2[bEvBTr ŘR Lb'#E$ &mՖ%P[QgXFeS#cY\L"QagZ`kų3 C $srd: w9@fžag< l7E>o'pCeE 1X}KQQ:gGM(մkj8&ۄD)JD?6IB՗eM|ZI|*I-%|uPNc)-BJx he] %ߦDrzqzcXEgR`uFK%&]@M GD:RNŦڐlF(R9K1_|KM+{h+ڤ2L%l[NxB6 +=O\7y}t^|\dʧy9.c':)'V{cӪ$xYAo&a9S6{ u]RňL 8ؒo-IxE<]HtcH2I+%eUtɥZ>hV"M_ۉtcO-xX)DAd}˴Xjpe#1GL0t:؀Vv-]p7{(TM9空4nټ:wyf~v:N[IUgZnO{usX&O\-|Y:Z@Kgݜ//uAPWngi%ߴ騋;eib1(nn",ˋUogI7fήkxo *\Mҩ4UckKʌݓ*+o[K='?Y->2yF&7ߚ1p/p2Ig~:YOOj,EM&fkI/PCC t_E_o^GثN7ZKqtBK|hE(O]l9JtA(D>4QXq ?ȳJ~(gV)_ēȷ|ۅ=ǿؓǔx1WMz<y$V܇IKT9]g\'[;^eKVZ*z\XA}Śl&1o7?οXTm߻ު2z'gF^g>]7?ƍۤQvW"nu+:Nܐ`l:6,42h礙ץH5䌘HM4Ħig=6,[ٚBFfa|f+QxT(ڀ7Ývl8yq}ٛ|toH?mz\0O,(^N7׋Ҿj hMLvq|P{]!J2`.fyW]H$ϙw;JޖVKR*Ɵ[iiz81M_./UTúLjlVE[ POEPJVAl03F6R2d P%B,rN !K{%BRoU+~J i\U \Uq \NURS?"\@CZoF ]Uq \UiWm!;+#j%  WUZp}*G++ȸ[oggi4&w3WܬB4N&o9[+jx \)-6*0 X\b(pUնpUD;G+RWN/Y-3:/m*2Izh@fqu9-y}m u_|Ӽh~ON=n? 
H3h^Asnj\(cMQY5EjEh"kF|jFˏ|~1'jVZUn@9M_鋴 =`6%Cd 6c-QIc+cԝd3:Fk&(/)ÁIkC^`,*uBݢR_-f$䀩A4NWpUuqNUri8ǁ+yH < XZ'{_rWPK1 p1*r}*%pjksO Hհ*m7J!IټlnF?4aK#YN<@}J>$D,W.kp K*س_ M/6(ٽDxC 8,7(hѯ w$tr*:fkDWyz1tVR h~'Pu+z+碫G83գ>?1(:-"f/8 ):] BWZPco&t/+CݹUBܺ=Z@ƈhyD@Lċɟ엒?6ϟ[CF ̟ݡ?~`Y\7{|[~teW ]@"k?׋((I ]'RwS⿻(Y_ӳb0Ѿڿ=a"m˛ Ws}l:ݞAoD^GdckB`K:S KélS6;kw1/ޠql}\=ݳ3ytķ|s&c?Ł?70LæwWzz?} wCNi!վ5x=R;I@ɪwώn~<>U -; 213p@O }yy}f/ 5' -z#ϡǿ.}F%K HWm]כr~Oz[2z(ZBojMg˺梓%*?yڟ>l ~GVߤ@m.ק| >Uԝkz crT,StRΒv('}GS&U5fх:kUʲ28rTk UT';:[;cH;i=u˥V򭫊Pcn{'7dRWYZD3v s&U͵`0&jڠ5m:Me$Ztm1Zzl ۷!bwJ)Q3U]dlrStMSjJ1'Zj{@,} BeɌaưF4CcvW8ol.HSZPW㹯fG>V'G@ۘR:Wvm7TSE4P&sPafCCȥa1M{'6+ oTh}GJGQ"_ @#.HO$Ƕ8K*6Yd3<%cNC 7>|j9 AYUW-wbSSI{4xN5thncP$ף(]w_KaN>,QM[tjRhѕ8:AOkc0m€ȣr Vq.`5 j%}IX(pֺƵ6QUYYbOA!7S46[_c 37% >{F)NutAڳZnB BQw_Kͺ#oG0L#mTȗ*)GS |FnQxi v**6(:ݠ-޳< ;vl۴PTJVTbq:_@2yl).[]:A8քF.y ,&NCqk 9:vXWOWgj˶ E@PjGR7fٰ֞wPѦ J(ڑkj o (dJ)O-` tW jX9M ̆2!I6`,v QAQ)tJ%Ce VHPcd2 V켊޶X uszc7SzCZȸALAAX' @zNB€("2] IsDUtFtQl<t,,xg`&b7þLF8*)ԙ@}B:|CAΛ:2]3 pQ AP{S e*3;(@Hq`Q <ڳ<;"JPAw/uz62 RoUWWaj{VQR@}J4nD32-ƷzK{tʚ9&dYk8"2DbQ{4l {x.ZQCk>0gM23{r_bF\X1,4ߨ(Pl^!0F!N%}aÀt&XwWv˚tq:n*׵*k mCka&a-A7Ay@yr*}tcU2){H,QUF2bx蘊'0#'iΣYΈ fByjN$\Pd"UB5l\,0#i1FXi(^Bd >;,Fv ݚZSP'ՓFCoׄ bL%vz4\}YeFĪXAIEsE6G=Tʍ*n8NJdt `%(H HČ,@z|!(ΐU߼Yon1V XW kMkׄ:HƓa~?"oQ^1 bp cFYTPcD9QgL 7< U @*c6*R U 6ƀ5ml=[[0si6@:Xf=ش=CɗML׉ d, U.#TK23'еe] vKi\xs9z-zC*mP zxq V븉a2ي¥f@ 2q]HfhJ7a#U2'YkO^ (TcJ ]\1d@b~pQi8 ֻ\,*w-8 b!C1+Pܨ" %;)jI-, kO]qu"BPQ 9r`= R V}s7o6VްFICkgIތ(N~ݝ2NudRk_4/77~co>>bo[mJyw r:l.l7͏}^]ǃimW7c~{|g{s'n/N^ƿRO"D1nmݏ{.O=YT'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qX' jAN &, 4[h؝@eTzN 2 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@zANO[/no;[MGw .իUټ;~ yú% d\1% KH2ĸK =S [Gx,Rj=~JXҕWLuGpiu-QBy= #fOV#~wwBוr'Ξ@}ŸwvRz`WлU%sۑK: bhzehhzJ+Et--gCWk?j)t^c;^"]EV/i. 
.\K6͆~7t8ⱕq~fz\|qhqѕ{]9/4GEj1t5Z ] R tʐ6@mOvnWWϻ،M?<|w^_nv8V Zmxwu'7Wrvs<"Ic)qem!Շ2^Ӌ <:löCTNC+jٻ6dW~"]b $6~JHCɑjQ58av:]U]{{IfRp}!Uu110!S2bZ» SbO%WjO9u@חZ #9iItBFt s!Br+D{t({ztTvV+k;RFv+D)HOW4%rc[؀# N'@w.{Dy4;4p Mk<s<%s<_#M N]`Yg *ҙtD+ԬWHW ͻG'݉U'f?#(uo]Еz`׫cbV/o]= /m]=zagQh-!/CWC,C=TOWv=#Tk!•+th_jGBOW_WO@*gikZ(ɂz~w_U'fy 1b 7!B\ڧhA@O"!6t(ȳEv+GWD& Qrܐ~= F.9)X8)WЕC>tBF2:DWB3th=tBt JQ!]`i3cpK/9~Zv!`D)tOW4%l9-Ac%#ֻnCM 034p MgW4(yF6!:DWµ+ths(Q,f,O /bY,7l]L oөgfBHY4Teu0 kP >[&c r.R_5XH/l>ӻ7&*n_ovCz!q}_+P3}cpon"%A!41ϸ!@ham0*LNAlpɊ$yQ62#صWb7G@n*G3BI5MJ3W_%Śݤ ^mZMg_zJN8ÇTK:YjE|2yu}xij1?o_ޣ_[wW+̈=IJMAkJԔb6%5T(\[u`:  椩QzIL:5 @6+bs|۹H--].0/.1)@_SǤ\JJffڔPIVgs.{g+홺f/ ~?d8uQ *wRvSb%RY+3X+?[^ᨾz>%uua.UM?M; *kʩ3oXh9!|\S0ne9f5z9ŋ7|RY] 9LSTU?u힊Kr-6L0$Kbr,h1.1*/#0 1eEm N0^P,f~u6<́^ ؔlJZotjR;[;\ݻ$tgZMGYͽpL `61QրB$G}8xG\$#5aѠ_{))I#uֳR49 STTqG.I͌۳UZ$c_.-sYAm[e<\b2,'_?Pdy2.-T)HEC*4`=ރߌL-}2`)M[elnxs= i Jx%fl&툱 .!ei[s?c4nڭIǡY`xl<1IޤR ;ג fhTo$!eHxс' \ %#p\pWFNڹ58a.b1 P5ؗUˌzF.ԛrg43PX\j7؉i銍,,?G^=)J*+Z~8jp+^4ࠖS4['f^"`d馟?> 7=rc.t&}] ށdAADcj1Ϙ rRnn8_n. ~Ď6/ U%hj˴\ZXtT;.US0#A~B@ԭg?U(--D&=ZqVlIadԅ&*Nz|Δ@f;%XJ ;=Om~2#v'gq`.l)|>-gD q!d&(phn%. ZSe ^ 3H5!`0*mbgmd> AD2:R+RO`&e$I4$x#^Nr0\-~O9-pOym.|‡~ʹ C MfM0P;RB!homV+ M02яU\5[_0^'M^%cjàaPNy)Ǯ,ә\KΎ{" `.KLF@I2Uo^"ו(,zcuf.[Q('~|mSbGx9)Qh%Xzհo4LcP>_eþu /}R;_bp7T6hC|' gn "9)f#))>JLGhK&̪`-gU0}TI7[>Er ,aVs vpb7BnARD}a2BQ4|JV~C-{ASѓ2 ܲw ʔxIypS<3,-4hMId/SpahF[5`qNDNRF9+Brc.ORfβֲ$f)- QdK\oאs2ّ{~ᳯYO_W\}`ZrM s2KRش:D4>d7X=ƣ+o+ʻIwUkԎ-6'ՓAK;Vla 7-Ϳ(glVU|a"!mq?=DE.0s"3qS= 9xy:8VcJF_v ݁u=͟_S n=rmj.)IJ"B8K6ޞḷP%FwjSE\o67i8IXާNBEϧi:Xs8d~1`p8߽u44}"m;%nyeB? Y{&#Z]\,Fv u^$rj>af^f׮) cW|Iq._?`K/d٦-9]IMQn\ ,Ւfe!1BӭH "l0SLS&֪J&yY!xg]*lt磈nѺ{x}u~F?j:*|VꞪ՚\ b'I%eSr v (oj*)5_ d(T5*MԘ I!E%dеlŔW'o@\Lk MFWʅ 5MTrt2SձV_:&I`Xm?WT,ˑUP 򈮈 l33جP˪18}AGO߀7Ӌ^@Th,Qu]/Y|H⛳Kwa K q0vS6J")+Y)NSS~ߓƷR p- KM20d }Y"dNPOrN٘TU1VоB"uDD& MA;8Oé^Ry_梙׋[)o]O?\uol^'s 3Hv>SlMُPIJLNwJ2z7`[#qyCJx#:u_H7Zr b_, Īp$p6'<&YcNuљHKU Zu,@!L }3;ބ y !rlQw?sܻ{sKYηqgm 3M@ׇf|Lx@x5p/^Me7uckθt! 
!CM$WU‘CU{tPQ@U9xmb,Էߘ;]Ȋ⠌W^yiײˮx၇>&j_~ې|;œWg^~w|7Q;p|L@ֺ;\nN2shTsZ; `8]( ^N%>C8,|,%2b&Y{} T9d5FX{n^U8FPJ*ŻjPaFK fsEACH:՜D}w%"H[@۠)'  Rkk@&PdmBvzvj{ۏ ߔRZ7?UYdir>b~N3]Z-촜D|_O^%. 6(򚆄6QbU±SKy 竤|[ds/3j^z,姻H6G!:5 }#Ɨ-@馴JNbY˥&k]"+h %5H1B٭gRBqaPpn.{DǧbY=J߆}]<Ųr 05Y* 5ڄ_dr11MleDӘ }cOjTߪsY{NݏdNȝہҘ>=\잺./mAуᅪwO/rĤ1OyOsPF2rXiz|7W})01݇w΋C/έ/?˅qɷkXtw!Uo[찥A炆cs-ٴm+6L;mi{wsQ3Z{4 ;oJ"`D8CL^zI%{ [5 }{v_ }dJh]}=Dh?V5`e/ LWT1sIrI`JB[')U[u[TT[SцUW!D(e'?/.#z1GقO:Ԓ2xoF#mThK9(Yk.)!͞䃫kۇJO,geB!L3^p[ SHD9&=$fphzDnǖٍP="CJJ*y0ڥX-1e|kDo߻y @b_bES j E@Mw߻^<=&B>첾ղ5q^CW:vCOڢa[rL1~VP](!J6rEIvq0Xdg' 7Z" 5YTAUs+J \**9z4f5ZeQiQҸZe*h"]u +\dugO;1yEҎu Q"Hn{ڄ(>@1#Uz+G>kȵihRTcM(`: cʍhzf+V7E)Nwzvy*k"٢PA-S5GY9Ȃ%Gԁ*K8encs+4ă gƺ둍9/m~?Y2REu^%`P h09Zl)8JU>dmvDD24Jyyc1n*R㋝>2WHs&XɆY6BMETTC- *y0y<`l] s MC^hVMV!ϷWȐa.q@%&$L&M%/n,(<05,ϓeyR٣`Y`Y_ތVKYgVX5 bv}^;ENSC5 `18$W4h[Ubæ0mmײaj%<|j|ѧorlр7\>=2عN"@yR]~.;Gӣf_`؊FŘ*=pFEZª5 e=v.] ii9?1ESU_;jG_GgՆ<-`-A~2佊l Npk=) XU!X*"6MER\cA11f5$YߗӣG zvvova^lzBt9^wmFo6I~/fD}0%R TGˌΓP[:j2QcߌeU ƒLENB%Lf *ZG&1JwX7q87ZnKZ_5u5"hsLls="EQ|=fvBBCP~@*QIC2di$ej(M7m H?Mщ9ƒksDH"mJFB2pNg D`6od 5 Dbnyp g$Ȃ0[b\ etʠ8{kozDq];a@Dj%gYtP/*f'tu DM1$a$#-%tc>cպZQ9pk$@\0qIx b8(Q\hț'FSBXM(-E+ o-|ZahA*U]dN,2Vg 9Lm~]_K$F&XjL)ȱ?b/k{fژ;\͒k׹18 x,DQ$ xCL7iȭJ$d2>^`{oKE[w?ShZ(qWRU˙# qBGAGZk}wt`Dw]^z U[Dk4c$0H x&0􇦿ѥW%de`Ϡ?.Vo's܎l l2EzvXp")-}Er˰}xv\zq;n1>Cٗơp~Õ![5|ASZdtAh*AKI`!#2y7Mfr chшR`(R[V*A-F$n:5hc_ǠTB9#`F-#vuAa&ЉLgM="<-mқ y:݃snli־]":`M]l˛%pθs9rk TJ̎ceU4YKN㹚-Nyr=[]y~71l) rKl܁9fkR1'Qi*)%&4GMo&89V_f[ eo8TGuE6H! 
A(QFYQf'ro>[lv_840PE#2=g /p̀W6SrͯKYn<֨ t dcQEH HYWeC!j,#Ɵ_N^Rٻ6W \5YN _O $ɘuO9 yϨV^Q6JzYymp%d0>|nuzut,rv|Q;Ô&29JmHIh w#Lf$zJ^`饝5jr{7pTt0Oo)KЛ.Z}ŧ2?u&lv`-Mgn4Ȗ޵~x![>?OW _?x1]?`N_(?ZC(/FOxK\I‘6E[B+gB҅3s ߽6md5Վxmx|-x7V7f^'UW~C$OdB-Vej6i:{U_lQPUxL ^͖ꍙ{K8R"u%F]ery-*S URu&Օ`XWW8;'W7uiI{|,y5 G:Hj:=9;Ͳ?QK$*O4E=]ӻKsL5m㸘SۥxH^Où<}ľzbp`Hѧ^^)'gfWC*}՜eKs7*c@subՂPZ3 0aIkzJ4_aIk4);i, ) ֑T OL`` TOYeNoY:eV$rIL"5YR6τPOPFژ傾\З rA_.}/",}/)}/傾\2 )TA_.}/傾\З r֑1T\ rA_.}/ZЗ rA_./ rA_.}/SЗ rA_.}/傾\З rA_.}/傾\З rA_.}/傾\З rA_.}/傾\^A_.}/JەȜJв/䂾\З r+}/傾\З rA_>`ߕHL`_i&WgUL387gʹR l\B(A&ɐZCP :kni 2uPN;&ʬQ˭5drgF\I'XR$ $\{Ʉ@T PԦ?ycpٰ,+#+t%$% E.$5*ieb|-^;cM2xdd @Bi^_mZQP"EA#%A i+`OuV jd{,p[+%D hJiKE0_ je%Sã7\*BaLd'LR#$BUkai<; .D$hGȠHP)Fp^Ȝq;M>ώli',# ꄓ#_&GF&UǩnF6wa%f|څ?M/wF(:(8 J|ٿ 3r~~<^ Gп!0>~|ºITĦcO6HV@C[P.O-nkGWF*S{9jguw/?p֟&?kW,#㩷|>s"(Qǟ[fMh*츚NNvHD?_iH>a\ ep">e| 3(Krt٦n|}q8cS?$ Gmnl$F:P(דˤKr5ȑԳҧ%d2 (D̆Ǚ/mQwa{rrMo;NO&&+rW-P ٝRq|=N,E֦>žctWȰ \Qicu9+h+k㯸[n*7{q~IZ>sP}cYg>Xd}WVQ/>}`-̣@k$hC6Ev @BE@GC}b=0lVz~; /hxT2iIjbc}h7ybe`qF뢄@`!h>"x=%m-L:CqJˤ>I"ʢ<7@nAL xDZqPgLZR RSЕ_]9QtLr/pOX hy0(IK4 Or%}vԉa<`"Oyo7|I ; 2Y$Zy $ѦU"QġĺD:'z  [o$O(&g6TܾjO޴2x!}Ŋ]I_cͤ{oP"8.r'[Q0^8owC['Ɇm}F{jꍁ/v4_!7M;sgh= >ߣl:n 3_Wj_x*XxkmmSeL=AԃzWc|mL ;n8w,L;!kiEvRNOŴO`Uv`9S0NQMpeR)o=00)*j\DBTؤXG/M !T1%$vs~,J Mj[N*2^}H]YFmFEaxHD :WZT$J! 
BA@ CXP#aA-Q=C4280 'Od>n37v,~y3Í&|,X{_KӽAÇ6 A%00hi4,DJERpEȇ{1VZz`0PWr7J[H2T=kAd 6*I% Кq0r֌*8cW]BQtޱܖ(\lo8.a8Pxe&XcFM'ᖋy0ICqǡhm(ZXq<;$\H:cpIʔ VK <&\ RY4P2ZTyTkB$^PpX$TGep0rևS9ˊP4`FkDY4bшq3(ޔ=8#}4H7fPj`Bp!6R%D T~XKJ#NMB+%Xz5%UɄV!%9m`89[$.E\jzY Rs0.U/\ ŢƲ^ vt$Ydt})J).It0ѣ94:ꬉE/C/>;EXl~aOGn_1fF&AnW_(qm~@nz.7|>L@TQhl-ZؠjCAӁp"/0/L= Z`6MYuFA"n:qN 8A ^hݛ&e&h'c3l"z&a<[hrDA$K7l 7 R$p$EDMshoLCHc-w"0eOBs2g/H,(F?7ti!u$fZ%ka1* 9Y$@^KE3̤\6&#ND2̃Rf XtR"c$cuyY{à[v\cm-yp 7c_υ<]s-.|Vb<BAh"Lk^ K\c5#+>b=Q?{8' n7l/A?-#)߯zHI|J=FȜf~]L?gdyw{$Q쉲y*#MĿ{[2R& Fl,Q 9 =OG ՙq"Ci+mlN1e1l^b4ܫMxY~n:^$Fj $Gh8mJmDa&c, aƔ j)y+[ْjtLl&-fcV.Ɩf)xCkeI b (n ヤ?[g/2>r!$3^Y 9~R4˚j½uWa3 C9qjScq[T#aI-r2 &/;ъͱ'jmO옟HhXEocdL2JΤARަb+ƽP8$3' w"BݎRU BM1ֶȮg[f61)1ԉn.ѓцu),2!hH,أ=*ّ;$k;}eXZTuj@x:ggr6A,5ݛMq?E/Sx[MO *bQe)A98`$+GW4II֣PCd3BX%KNeYYGj=J2sd(1Rw^R9r0 EӠ-eU&:' ]-I/~,k:۸*iZ~^_^~P|I.2 3AL eQ'-'EͩdQ/Ԃ8,Je,_auVX]=>oWڰ}*RVԕ0tc#w}kcnj>ɀdCL> mI9YY%VˤR* d۔.} {dG[-s.{GB*Kz҉omJRs,qÃwI2Ό 0$9٠BaJ#ǣPQ]jG$_`+iMp׽nkc451)v\9*vtl6XlXL55Ʃ^ޯ5HzMvݻTNӾK7iOLYsD[JxI֐d4#?&⿦Q;hqV վ!dCFu{`@G$zx NO,ZcŴ"{/Ӯ|qsžZ3xϷpoV8"W6ܒ+<8"kˮbз%r-iRn\9Yu- 7r"u=ҺjZs0XkgZVԲ]7zk=/\sfO;-ݍy=OywbyuCӻ;o75&[s6YmsZ<-<nFS4ҡvEY8.b@+i̡Nfk3M>hZ[8ycdB*Q+.5HKq7`P')9tzܫdzc =XE1kNٱzj28ne|T~'D>79D+1[ \0+QZa۳OQ :z6ķÛJMdYh;yL %o hOƷU֞oQk$o.7̝od83 IPdⱋTuoYYOrU+ޫ\f$f$`{L](yd/~Ջh8iIwxE L!dFݽV&d'# vlOLjѻ* tb+ӨpB6Oޖ*>)jK\j?Û$4wZ?\\)ύ~ۧq8xcS z!TΩ1h"&*J -8ddc8ДJh,dmL ~Hޓ== ya?\2N׎1"͋m-ⷳ!ogE =/I_s֞~9ZӃY/v[#qCi~og%֢lR ԣO5FT 2;1LWċrcEL iw!cNuډՎLmmO),1M>{; !0"%-AyL`zQ[HQ  Ez٥ą`&G;tTI$5\h9ZBmC\QNyQ/69/!\f͐ 6Zy$ S ctb0Z Q&oۍ$DqxX^( Zg -2=fγTcĈHlvek|.+\0111 YT9$CI"Y^D+ڑ>4" 22^a1!eٝu托bAjGg!ʌe| cЊ8!g%2# h_+8VVdWup492ڭm]cEqVdkD߳y|=-vtr;?z% hz3it4:E%W!VbT_%$)yp$Q._#k?90;ST ,!y %x lИƔJS c`Sӆͱ&,5[Mlkd}(B8GZK%$6*=:q msہ4&]I GC?9h3·F GbD#U @8PG@ -`&1UPa\Ԁcڏ7Y7oui +{oMO%o :6:1P$Rq.pÄY Q4hS(D8D({~BN0tC]651f-XrSʒ<J&-mԁFVԀ{)6 n Kwm$G )9HV" 6e\Mp_%4%)i̡()-q=5UUUO">FjR 'gm1;G b%+m#gz4~3/-y"yFwOSGlyMLn1ׇ}e#R8ŝ$S$$ÂO$1 $'4܄#XSFLg0^3hMHXޡʏd\8'(Jg_࣑@%pF4O͓uǙ/ *h8̚㋗Hp)z'`1=M:0핊Q }c@ %7Q6H}9? 
}#M(a8dc)eFi:v'TBgnt5K5|5O:K^m״;0DFNDVvm 5)N'?NCU[=FN7xKg; /^0<ݨEGлbӻ.}zwf-Ӈys. 0|1QM"._a]7 6v\jx[%=熲Rmy]7]oݴgMd.mًd>479Gu yXkpB$k}ˋ> *7[KW̯g<9$çО,9f*o17߅{@%WW"X.W,3b?\-~7}Y-CV'TC:&2yUYxRxBSd{8\Ѷq֥8ד,21s/LB9O4^}xw>~%\F7!9\x`NK ک_W=a7F̪s޲Ie:GFnǞ!;eⴙ!oཛٟF7u#!a{&׎ka D+>`|^u.f˫׋4)%URTl%%B )x܍cp0w{^U/VzX| ܋^G맮}~~&r0P.V(L&8V+Y-!jt@[QK(΀'-DG9"grkq6d2#BA.%Hjc}J\&YEWK2!hZ(ҦCqϲ<^[^B"YPBX됏 Lׇ,(5f9܊x?X9;% :a3%#SJ3`okE&wӊR- B(! cn6+9fa EXl/5p[+%D hJiKE07[FLAB .@!h0 &ē &_#$BU Hei .D$hGM!r)Fp^Ȝ‚;M9q윺"h" ꌓ$ M(M9qb;g ʤ)~@C+l8K7M'1<(NFg/rNrbN >ık$ Gmu6#(Oe'ruԭɢa *W &̆lQ4w4v}=;ċzQ"ԋ߬|]J̅^8.ci"9 M4(7IiG8Wd2Oy9?vRWVηQ< K]肫(Pb]"SssG'-"yLbmxyO[^VᗍVlbDh}0HQhT:'`NE=8:/qt ^t1nGze;U?2y~92@q 3^v_\Rc%&2 54rg&JW}Vu9Zp[F˻OT^Iy{B\ 'U9?)>㓗o染)Ӿ)ϣ֎lȾzOzlNYdӱ|~p oZVym 2`}S/M1A_mCkZqܲO50z"Hsf8Q> a VG[ `N〝 )#8Or(,Jܧa wahYซbjVZpiWr7J[H2T=kAd 6*I% TiX5c9[.,B( Š mՅkq-b|l~7j;Ooǣ+5 $R'I4d^ 4xo\iR{.j4ݤE56< wq&>Kз#$& 2W.FM'ឋy45CX6 Z{`q<;$\H:cpIʔ VKa&\ Eaw@-C*Ћ<*5!/Yd(h8\O, xt>,Fq9gY1F,jDYX#A#q;M933"G#hHu&"Jk#uQB@˶4:>1$2xPg\p^#QRH(L(nR kb쐸|q1:qɾzpu9A/zqk5J%" 8OPOqI Ǥ͡)먳&:?zzXa18}'TتyX-XG=}_p6I! n~,w~'r0=LO=jUT)[K6kt ȇ&:d =&tj҃&jD$'=t q0B`s$sU&T C}hxŨ5q:B l 6áx5ݑL:[b.x*F@ 4d@Z" kѸ)~1l 1Dn^XP{>ui!7Jq1ʹJ&b U(T-WD؂KE3̤\6&#ND2̃Rf XtRAƎHf[go`ԇ 2t5{TEvk O@0(MĚikau& E=TB+yye47+_c2w@3gzɕ_)9x`  @re/X,rFd8=ŖdKdrjٽؙխbwU++,\v D-sJ#Ej7Zk!)I@(h)0I$ιTdHk,WmB {.1с)il&#lJG&G/Gyωr;S-/ͻI] [;ұ$c hLFICuΑ_LǙ_t6uM6uM'_ g *c! Vلd.;y|H,\(%ZZYEsJEEEDOeo (B1qg$Jؖ8/w)^oÏ/8}s鶬%0|I6?r~|oI WALWOWŏ[|7>'i0/3`e ;D+!XMQ@oF0 G{z >r{, x*B|{8ĐdqYŞxt$ܝK -("PA| F98CnP2G4Zk5Z WC.m}k\V*|ZYԆw1*0AؒCKc/IHS^h9%fԠSp؆o$gנL;cgmœ(1KwOXp7=~ڜӤ4M&jzM~V gZ_q`T"`d`cuIQ %2 r|JtmìƧMTA Zlh%x0 .esF}!Jjo^J]6d"'4A0%ݟM !l! (Rq)<m-si~,$Egdw(%el-tuR|RO5"gq)iB)I,IyC,Rr[\vXpogRTb'm*̎X'31:;sɒtvVe3x.HpPY. 
[binary data: gzip-compressed contents of zuul-output/logs/kubelet.log.gz — not recoverable as text]
-k2N3a(!L2S~57♣ԁU1Q%0_N껜H)l) wbmqZԅaoV %Ts>KU9p^@<Նs 0K3.\fKpY*āYL) y'^{0T j5%&H?N~b83Gr31[Ϻ M4gZ]sJ6tMwW\{??t@o0dD2/ <.|*f:$OtpV&əǿ]2@tftƳy(L7%,5*ц@:BVuٰCM} 9(0lp{o&qҾ\ w&Oy |2 .lqk``Ē 2ӏ;FEXk S5GW5|8kX~7d5k>|7d0{`9=K!67z ~v1eL[2ܳY!;ozF7^pq;-كJβof'kqza]l,ͰZ^m=vrmhyJ?Yrok4WBgX<,qax|V0~HXƯ p/ʟ>9#ˇqrZ.2s5'ݠ٠Yk* jŵѓ3d OF)%.Hþ*LO2˥̌G8pz} wk!j&:b~T3- K}9ypPU-e i[ׇaTp0I=T)^4=WFԟom<-=±D;˩3˷eD_ބIgB̅6KT0aʅt`ImF=͕ ++~Q.ʥ )(fTreMI!G |EA_3*܈ ?3I\hUcXUn{aԷӯU(gxxa1Lc9jULDqi+gAxu7,dc]!1w+VzvQٓbǜ,i7UxmLrI؍Tqf喗 N>!&O(R,Stg_^~,K'uY|*?ίRtT%d<mH؟$fvk CWKZ8.bа3Q3%fXBHpwe׵D{JB[[0tCQei(sL0]l!9K-H+YY)b٦.퐽կ>{vaR"ֻ뀘+[aN_Xkƈl [v@=^Y(V(RyXx̂1F[jZn @jؾxQ3V-5)FT:l[|XQ+J24I}XJ'C_ *驝*Iu_49Ŕny CR| =v2S?ZܾFs7SJ\?_-N.W?a D pP>TF+5rX_FZƀbQ.J4]WxF x'p>[Ǣ|M2nqSeuS\`kfCəQx̂#(XDHsRSQf8 ߠތNo1MyYjGoF8C 4&#@I|hR6vɝ'dZ࠸lC~hTǺU2F"AC{`# Biu<) ia0qE%r { ms̩3: A!·]vMk8utcāa\RD G>XxJ L#cj)FKQ@ ױd kկf]+qS"%iI3 N0_jŤJQ<.¢WɶA㗒K\ay0c(LjIDTe$& dlɃR by]rWZ"ꭲ`8PN`'%amK$C1k1jSe]|.Ais)VSvM|`eG{Yt+AuT{XS-֛H"&D^e%1c Kl7 PP Ps]ׂZڛv4l-)'{ŏ`?'qT0=DDs( "~i$DE8`{vCQbu#5ŏX^͛^&V'1 ׂΣ,.7 )S1Xi13[M|E!DƔ/IJ:612['L 0#ͥۼ4)..[LdY?8h`/E k4Hɋ2$ց*%Hؾ}#1hNhIt刲RzAk؞eՁ`wcҫl=lU\ԙ2DogZ4yHHVt]9̡P ݲn>tCTbZ o tA+ -C4 thS˶ۦ@`7- ,r%%l'lܤNd( ?MN}UIqNd, -UNd. ?r%ۂiU엃c<*&OJBҔ[Qim]Jo ڃT\ʉFn忊OW\HL8S@iK)XGbj$N$z8/~=50+%4W+a3;͋qv]1wu-6uP8N!>f6c(؟4PD4<۫}h_E ۩z wE,"(Nf #xA Y@^Ej^h6o?Xتix \J̮r^߅; R0OXEtN'zR mo=o%IUsEU5d}?{AJsT)zjܢ^yuf]Y7T]Oݮ;}7.;t OGKvu{Y>|{ژ+m~i?o#n}iJE_/7|ݵC6,jfLh"j+n#* 3SkIna\B,BcLzKΌwtSa|c ?t}LeIӯ ` +R΋;UZod+*;uZ]ﴘ!Ol4`W?8E&)nq]¢][^X$'|X=ZhMyW^ygŶf]=I^t6qiZYțӴ"'Lu+ONeQz沖$YQUK{u-] N IxrFg)Z>+ B%j`th=]{V@L9P5=Z3Bvg6Dg 0]=+ {oXBHD6jgEFfTE_Ptmj` F7~RŖrg,fHfNr dHH[sts`m ڗa`/YKbN X&2:\ka';{q,Ch}lRROwBVtڰmرla =lעa}lVMtxkQ͏VH4I9 "Ys'G":YZ$hgx Q-UN4| ?ELg{(<:i8 r1+=ۛ/*I 3ѦsٱC&Z!͐vp;Bw6ٴzxaBN+ꀯT79/4] Éd0Fw ^y޴70템 d}uJ> rZi"zݬ6|탞 Cn߆:Ktf4>I0nZ}p"%3[r'NooAd GHᲁosû7p?y~ƱF;37d$K0|r_TOb'uPvX':XqC!KCYbR ֬vy3'n.䙈t>@(QPA5Lc^JTͺ*\[*B -<C?$I @9C !%6IAL2iIH\y1EDh߀8m Pq֗B.c{_m#_!] 
q}8 J)؋f 9JrhE^$>@gáPT){DɉWngrs;QdDJo~~\X!zO!֎8͝Ҙrr$B SD9*`V1U7 3qs'Aǹ!Մ%^ʠ ޠh|fJNPiVьz4@yҁ@A Qh`T!q 1"%jt98-H㜈4g^J@EJ|ϞQ* wH}CQIt.$h29A@ "95Z{jn8m%F+"5"ݢy h',J㉐ nPC_iz 3n5¸F$_qg$^{Tʒ"[O8QqPJóEG`Nh&4|)AdBm댿`l$x:1و76D_-Q} Ly3{j@ YJu ytQjdS{)n3\Fvw1! +yl(1<ܿFSq{4>"Jl°(_ m ?qa-q'{]EhBs W:j)z+lS.A:'#M "0>EdLܭ E!)íT 0(dRrkf&E2&_};܌GWjDO\'TgZ+Z/Yw|vp*P"RpVMt^ 5Q(zJGAGPS1$X"A^DQ5uVXI*\ȗ.#R7BV[P oiy7Gp5I7nގW(yVk!gx;=2賅^p2|;iOӱcYAINq80W;vV2ooÜ4m8wf_->VCD~y Bj)n'Ǔ\h .~J__;;.RPus3G!>[:m\MSR)NQzw;_;=M*elptA+gdi&QSǝB8~ *e9SL JsP^ZRK\xf"1 ޜegޙF-B̲0tK6[/j0UN@7)Z/:L:Xn<|;W+7RUj_ꔉ|˺T J^xO(N*(E)0K ̯p3dyqm"g^Z\7,gcnSEa1>V nU݌:HCy= it;vzWRtteG/=3̋hӫ_gZAgG'ҙaӻ(ѐ#$ДR q#m\ތ*f3ʾD{ @Ql9e]4b^)ESpq9? rl"e 2-hMZP$$(5F$;tFKnlE6J%_8. 1[,3%c)cF%&]R ˾"CDFq"2O3FAhV|K-{\E3b:d"yLѠR1-_ȘsV#VrbS޽|A.gw\JX;e\+ΖnK̳zmXx[ʶ6;4~we{'o@Eu_tCLwprا+ vMRnIf~z껇9dT@~~즬]:&ftWs(C9XXc!H"a;8[XT7Dv4[܇;M>(GOEQz6䉰k5u AO]&.C+#cYN P̦jbZ{lA 5o5ex9TN?u 9JDɴX֣Ϙ 1qaRL3=09?cb- 0u}c(PUKF+,K0GTTBb$ڦ*Ԫ3]-jބB)"G`Rp{kRi&#")~2CK=Ns vX2 ='˸x[dG -d>N'w*A%}Zē!U'j[Rqr;U ~70fڹgufBX%3e8a/X7J6w(%GZȤ+Ͳ|)[ZQk_Y#:+Cl;VY2=ET z(GupOדo{.ij_PX=ԕ u skq0P:3mQHUeKXmQuZLU/ua fЏ b1*OW)"i+?ܠC)kZA5iVٍ>|mKSxmo(JMwxY#[nZ"%ʴ]} `})8b<$fvB(chvIڞGQ@2D5 ՌՁWe\PAu%ZSmH0G, jjm_9'Wr47jq_55gRre؁1Z9r ~sB2Ag=?r S^Uqo2xknĭ¿E/}Qq^HFG|sj,oL܁ 9͜ɫyWg䏳Yd.WΉ4lJ"r7Xr6do圝UL?Cl\hVI7;HJmt)}Rqە*שR)+Ag6_7|yl-oؽKSx)PrF>HY}ѹL}h,ru[ʩKWe4@I?F?q`]Ks(_,̮ӰA| U=ڎ (p1]2DDj@!kF d'X/r:"/r:NF KLW{+j DɈ7zXR;tFД+G?sw\3")e|eH@EuN&JЮp'`#b"PfϟF<^fσ :t2dz'apic7<قtӫymHGcosuwQʺm ?{;P3@ؑB&x$҄<WOH9iD{؝j>ɬ}e[' m놗S%#I^63>}4=O`)b   j\ Av|/jy\[1M%w\V*pE떗5nJX镹l:j_Ɲ!"q\ȷm5;fp*O%-!0BK=\R-Z΀q%A^X`y)2+0 *]]͸}qg;e.[Ewqg=en#YUN$fArlZv Ѧ;],P .9].1UH*p[yU}Z2PZSv9U+pmR I-BL.fN0D2ꍖxC9819,#l:©s ~Q*_#\ *-QVj n'gi-@0¥^1c \7!,@ޞVI0!b9*KZW0kJyAĎ@y0 bC@Dco}A d*QFa^h4ZAy=V a9L:)1D#<"!XAi ͼmng_.g\h[>Yu)| V HF wo'!{}ᙽu)LdhrЅwq/ЛgFf:a8+Ox>XD[bGa}\5h[#B C1W0)(ט"誒`@ƺroQRƭ"@ pro!V@^=PBXv]{ڠ6b5:϶HDB<AXJ?k=w8 T EB9iǥ@Z2aWi p1 MZ| 2Zb 2c E- hO%CJ2A45k9=Bs̲)`$mu4`zߢx0ɾԷOP26:?wf777>ػ>~N,7?4=vx(~n!508%8? 
jݜnc6E/ौ$On]' {5?ݠ#i\HJ8!$)}d-yI0KtO:jGknlQaP)qA7'} ǖaR8.•r\wU&,lFi;c8qEQF&H&%-#x*m'."4"krsl"NZm)vG¨ᙔgkrJij3̼$!SH]̇hSiLB@?fs慐M@;tY&Ma! q}eZH}o۱ZQQ -w~G}TQfhEfŸ`޿5;"!=0XFC4+.J9VɔdR-7r68k5]JR]>zrl1.Qԁ2=1 dU(C@ ^F@Ũ/(ƚ_ szS=uB6sg"s=,ayAQhrBXKkGbv4<,P`qa59G h礡0{`HH:Onj H4TY#EA\JBiG`id|)1UMNLj5Ź};WlD+1 ӌS^p8{FaSFkz`S҉TxA 9[ eB̗VH]*h5,[G^}+y:֗SqU6y%\1Yli7WGG3&ބ5sfv7j&7M2F7{>+rHoPrJu܌j®+}GW z\TD3|Ƹ pW}{:It{;%beWKy6b,s}=*3s]p֕em@|8'QN/G)|%Ck]UI`]$R=T+]CX1 K\@"e5pkt/3]/juZV/_i}([Y,cp 6xKK},.wv ~cVNxݿv|?29G4>ܢK0EKyx:e/6WJ9>` ߅=~4;xzP{`BrݪVax&a8˾RV3 ʵ YZ'Bۨ(?+I)]ꒂa~<םNnŲeo]Rd{3KRnm4z ABz 02ov^cTj¨]!N'*F5Ө WTZ\)Ж\m@In0mt].$Vv$\v6\FK]h(3$†r-uEbt;B$9gVr",lY_ suuJ}PX{B0k׊q.. }!|T:N7@Q ڹCb}P2$\gW( iOG*"""Dt7Y`X0ٌ IDϮjc+l$A6mŠd@6y[hR>2a\CciC[%ay a ( < k"@Pw: Дs J=|AB3뉖PI 9]Òq%Ob2jXRp-20{ʾ PPmtŒjSg[IlM[>oIQ*?Ѧd~ֹ`EBM)Ig>twjQn )+DzMF#k(O\4K&r!-Ѻ΂.M "6zng.]T@6=.S%Ow|i*|$V H2LMuV'߭D 5f'qçt6@Oh8O+|<[R;4.Uw~hd{ӧ,߬~](Kj\Tb=)Vkf{wۿw3 H, x[2bU!sVǕ䪈łGEH'Ŀ"S |~]I>ѿvATOi%U Pc^|8mX?YEM[;to{xҋX4H?xU^Yv&Lդl0K0Υh/עJt:)uzBsOAsMDi\$$ͪFR,wa*tEgRdAz iө #I_fXR~ic=k^_f iidmmodŪMn^Y_DƕPE-]^ڵLE#FrwW9|[yrW=9@z˕6gN􌴂מy7wg(Y,3pR>+͞yf yvV}q {g%+ "G-'Puo% 4\zw &"hak}N8XcIrGLnX}n-EF?$ }gw[uÏ:KJAZGg5Mj0Z[K=WĒ60i)dgE<,'>M6V{lj,YWwP>-B Jcs(ӣ\3dPIUU5ltK}!*9TΡuUl?}oq$VP#em ;.R@F)Ӝy6Wl$^rی)A]f f8+GY#M\mqZ?|g ϧ<|̘=eφRB8?2p ryr,чWԋ\bgyuZ bZ"&ZYRr+zQF빠#— $-j g_qS_qj y;Iz:'zmq2iF,9Z҄ >a8*pPQ {E)P@}Pwo'ou|/SRT0%vdlͼ} VF~L[%"hJy,.B sAr)x ZD Z΍s;67c#R:,n 8t"j; (Om. 
ԨKM`Bt >GI:!ID$kC1Kh_U 4D3M?&EwO90ZTu&PzFIԆ38<\~|)3il cr"ru5TX26Tm;mWeW__A@r}\dQlZ,}[1~5ȱqi>vWrYu1,&v\J"{52;V> pު;^YS=̩*%z!ʼ~I2dZo *YA/TPy莪*7bew2o訌yj~x7u{ͪ"7t^2z?pg^q׶}m0Us1ֶf[ γ1awF1־޳)'FEnKuݩVhnT;FPBN$m7X&vwWNl=;eEBx.ʯzKAl'{Q> 샢x;ɗU/<´EJ>]T\ꋩmhrz =ܧxOE57-Ǽh_|nyNݳjݲʳkNgQQ8rj;\DȔ4Gbh7jjbP":mD푧_tڭ}Bօ|"BP~|G⻵ŠDt>v#O%lڭDK[LgH_O3IOYEaxS;g_}훋_]}#<+7fBw?XlȠ=F ۄkz _?ؼq7Mܾߪ*t-}uOκ^^^ĉX$ niӅsed\iz5Ѭ.2x~:O/'^j2yv:9kpT;z{M}l --VL::SOaK4%[C)H.vhPkcw%FVguNeJ43iC,~ eV֤ChC2+S>;–V-oYal$=Ow/F3^K{׺6Y!i9vSH6ثz'O-g!WƤ4!$i(RF -Ϙ,(vۋ\q˱(Đ:=.+YJ|zvSa`]h?31aOuS~&O=<`/9 مzyiL=׸+}l4#аB8kO|yVO?%|S!rc_sm Զiz/ѿ;nы5 ]7bG2(h]h"Yu}hC4  0,9ub j5g ۝^d>3F𥉉9> {ҁ4E h9Os3s Ŵt(CSГ :Z)%wr`ȉ˾Z7@ycG[ %ۧ*ŨKKBBWC+H1UzZ_~At/y(O!~,,[Ҕp*}5#??6hY‹-8~g֠z׫nCOoAYđO^F cޔohOrhf% ZZ+-yҨu^8>寛ET]~i /bQqqdw hie+1KjjNxNkjM􄆎-y7`QC{+`eLw2{Ny.6WS |p:: @T-@/}`;E1F9-pNxoze?inΎ 9̌to|x+&ݟɵ +i^m"ARIpC>?gK~EFX)aqJ2#`]A:!x\0s%-;Ms}o; v پnZϾ{ J.(%zkdM˜!L2LK^rj $jMS^W铽!9 >KBJ9g16SLZ}*GI5}V{| Tm92Mɽ*H )9R:aQoDҀ`#x̀]'JʦkFIqۯA%Os+yL=yzl /v:,:W56`.gJ';:JK& MJB:c.쥥`IA 4Kr딌^'98Er41Z!m`e6EeIk@Qs, iϸg. AB}L$RƎq]wgl[-Я?ڧV֫Ri5_!ډ?$Uլ=FOw>>c[?9333L ǷCTၛ,.[I3TS) rNvC$^N)RA]f 9jY 2k=͛oC뇯H-g:J^R\,-cӴs"y]X΀!h`r&7ASZ˿ t֫@$5"S8ĨVzFYlY˘hio׍^N#5h+%^%J[ȔqDϯoj_xx 6#~&p)|3pw훜KV\V꿹O/oon0L*fPz>S0BLݩoǜøaL1ڬ)(%',Aͽ#TXDj jjE . 
/k^KVPܴXE A6/>EGêX]b(DB4[cPCq}LF3W|PV(/,~c¼6ĥzmk=q8N *FC[н4Cxh%%ʌYMTL^f@9h)*_tz7aɦϭO @L`h J1J*R$J Q")8̓Cx;y[ ǿm][C3%-+-'##nyʌ50Uӹi(ڬsQKoéQ=Q32##)r,6 :2U3nS *(ӫZp7QUH`ZbR!N2 $hE&TRP>JP.J"&!.ZDVkVW;%uKw"/;D-Qj* /4 BQQ[\vzyE##qG%KIk)I$ 4%e |ID4L'f\[ %*M6Pؙ'Y?%("n(!F/B;$Q(g /P,j& M֛|& DQc`k=E91zG祠$Ia ZMg){g9%ڥ5 3[KNQ2*Wj$ԩ uPQ8J[DD-13h4pU)A+ Y:fA S J AIm0CXˌ!P`Ή1[@^Q\8JGjt^h8st!ąPRfI1H|Uā3I42 `+q f^S ܦ )#](-o+Q:!xvror ap0Б$~k{=eYaȈZ'w+S{K!P}06OjO)N]>.އq8~8~\=JۄެMy/Gw@k ;5lW}\پ7wonӤ[ގx҆pʖ4՛@ا', &ah{ٻFndW ,Y9BE_% )fJ0SR6Dk^3N('QFI!LKeLR9jήy'@4^ Uv RuI{]T- cxɁ9^$o.٥I ?$?w÷9`_~j]z+ŢG2bVvѾ7GGl]Vi[gv[B{v(EUO:%P ,6uYְVj~nə<,.i~sLڣQ|N#>Zqcbf'dą([Tl^u}c<3Sf vK/gD(K:@a9GS%ysX_LyDo6lQPKpT,`ҫ*zxYU&0Sų2AcY"0iDeIVҶMFܝ6>z!z7+1 a`p 3菾ÿfnoϮ.|H xٗTk( G:ч]͎3{zEP|"Ĥ/%$z'JQw.h |B|x}G!+5 9jg+Uxb]{҃h[vW6<%Kq|SRI~vcPPM#< `qw&46.W~г'(CH;n9X]IClŹ2 "$!d$MTj.~mpimЬ鈆*RKf-p=aB.nN*!,ͮ/PQ]|ǷxI(k`} VMր\T\SO8H:m)3N ?tӯVfbQg=L^x=ՙ9uS QdW8n} VN Q8kHe7U  hV]݉W9V++U JPE#Ϭ"R&:@iC=ׄ' PϨZ'0_sjeBO]tF"ljkey_K*5M{PAQ",WR E߈=%DAzxϡK<2D!! R%ETڋxh:!ܔ愌粽!ʆsY<=/8n¹ xwkL#ٌz]|).aJF:YtƖME~GWc|(/^;R2eS.oߤiݭSK0ϯnHeQLWOəxe|G*ZԢ֓LywiR^}Pj_<ͻ8fp juhTK\u*_JnS亚k} u#E˧o|π<DzDE+L,0?OIIA="YbwEKti3e#lzWBx+'翯'c ^(E$$4ֶkcks6̼j'5"ht;J}7"Pո+G"w:ca{ڢB=6A͍m=mۊPpKEtT`n[$'d7ASI_qtXPaQK OEDa[/Z9V X)>Hq`)b/^eҶuv3N E:V9S! vZŝgbE%I# Jc/Є:DmڄWM !^j.42H!-SE&#bP':؋i@e-ZXA7̤5]iavah UTJ%FT T46ƬS)Q.Ky{~SPo ktw TiH2uݡe}0HYʯ&LXlh*94lOՊ3u +*gm~o\NҶp/&>^Y'뛸ps^vGС(bJBq>Oc,'oz=;di7^RlAvE (B@8-X@7KI2& "5a}s`JA(ahEN cY?|!H/E2:W:Gm+VYR`A0*|eAr ''@熧yKW+ h4%q|׳Xb})3(bOǘVSAQ㕷 #a.pފ`DxdƔpQSa(:)hf>%SE$kqleTR38* T eҡHhtcL13[B& a@߄aXq4JW0o@cjmhgKj* &:gpfUz)|]S#hkjcѫD! lJ4n5]aJ,7Aٟ2ބ'-5Y3Є&4Y<d{ަSF tJ.4duq:*yK2O9ZzC,si >|1HPF @C:4 z{ zDUt9tFn;m_O3%Ϝo}߷"ttXZuNNj2ULoTAu[e@H&2L`<;yϺGJج<1J'5~xYtcHEtYKțj ysEO\EZYmEӣKq2z."QI{ZFY]#Tї<uc(f }4i9CQ~ܔ3S-}' Yw6 ynEn,F7'ڃW媋SۯF9݆T-5Ffwbrqf Elǝ39nF;OVݬ>`&DrS vlxvӞGv? 
|Ly@ZYo?,< QGw:ֿ|lj/qMsul*M?E<ꃳmXhO0,\zҐ\E tFؾO r֭9S.mWichV^iА\EKtJ6AA~j GuJź#8in5֭ US T?>H."鷵I&iss.Iwhm3b!aR|a~'1iOjז0,vpƣD,sK "$ )"} -F Ԉϓiz k7:Dn X\vkL5z0ItB>Sͻuǜ/0?JE P-K8*_XP!v~R,m[F6"m#a0ԾFe_ `:ʆSTG:oV1vW|Fh0]y9B)^^i)VZdVABPaĨTz<jV*ZjU Q 3B}pVF jQP }<|'%0)oњ`W;R I .<!rm4ЧllSM%;ۍL kQ)MA7^V&8p/^T   D d@udbo`7sF"bh6v*DH%!U䧵IJ*yMJ}@GRBF:``pgM)zaXhú5'7º9ՠaL9PD\X7V1iXuj < VˑE.)MtHc:-uwrN"[^$wdש>[)7&[YTV{OCkؕХj>[IDzx4t.ҢRhm{Fw[<=/'t.HN~Bt8[I@tnMp1`my.wGȄLuSF㮒MIN,U%)w䡂v9_؍ev#ɑ]ewۆ6<i $LԥTU-h0HI%n&:1 5* ƍFe>fעͥlz !~`[FbjjWZvkB}]ɮmu mBb.=b&dB&$&d E4Fxc.[ϵiP5 ղwN {j_/_Jmq O R8_\o1`ft ʐ d R+27[4АX:y퉿Rx,zjtq:߯t\Е d9bD"14 7h`J"9ٵp1ZhY| Tp&o~DCS!c~tpv~Xc] h c qYlP`vU@_ƈU6qm82mF ΃ՊGù\NVv^Δ*aHM) RhWBXI^Y㾯[o*6|0@zѻʜx *”"U``"p+k`y:젅߬&XOx~+s}Kn\A Yg8* FxGu]]5vrwC řR561?&mw+ !(:2[*fO[;9 la9 :w^\UU9c2"<ˆFUmro8\U.B0=tD^o6 %̗-m=2@F%l j$..".8:͜fsU㳳ow7_ė눛{nI29cs 5IҿySZhfL@5i\rƦsq/Jքd$`@O+(xޭoev3jrtPCCfH؜  ȨLȒLwHjbץщ1johȌi K32gXz'$p6̟\zlr}T3L!c,~ CVa,Fs8ؕX6[(gH:)aDSl)UeۚrL8~)nVE UK+/JE^Ycwߩ5` wzR 'ÆwWo968nAKA e*n;]Жz7G%lctsv;Nns&hH#W2(6,!0Tǫ7/OqI1nxeoih<8&SKgz<GoYHqeVJk@P ~S 2n8Zs&~Ѓ!,c5jlݵ pJP 4%VpE2UxMָj6~eeH;y]M">zi4W՜˯ϸlVPk_7#ova/Tz n1Tܠtbut/7E=wu#~a577n|c=7d^NW>]l#S]<߭ga*4sݰ_?>zyuJ9햮Gg3RF 8m]eS 3Ir"[$9<_ +rEtQ(|Omƣ]52'*dFSh zQ*+Z 'a"dN2c ds dsu:s|=wvْ UA[EN76god vN 'N l\7s8Ce|XUO2 W9<|r=ᅔ:=vYEPFxWQ4nCwwB yuka~8jӝ)RNd(oR,@,ie6z*L\n\KƝD)hd='ޤ)Ď:jTI n(=b 4u3O\Éf;ۍ|BHy#7kСhE(I{_m+0TObD^4Hx %j:kۿ VD`R Z*5y<oc:5Rހ(ߺc @P Mx mt/Xֻ5v c:lH3X_oN{8P4+k̪ N݉:p<IZWeQTB/=lwBY+b)&X_,N3NyP&ͽųDR Qo[xFkqFM!c)R9fp3z?n꠼GWaA{ 2pxZFs# Rv˷\HY&YZ-$4]EZnErN&K>fGwk\c\c#L<Ή,Et(U=g'",gS0 k>.eX:QzWWVXRV{Q4$xGݬ밉s'fr9 0 LO0Wd.8 T}0 w׌&X f)=,OcL\Xԗ'Fk&` =,i]~4bV-H#ds$M vA~3B hf)u?w30BO!VGw V@BIA1-cyei!]9h6<O#|vVpI}? 
E!oތCt*McbF1Qh!¹Ͷ>Ӯ[飣!-r' B_dz7F^U6T f+Fڝ~V4*ߞ Gb`U2Duodu}p_< w 2 A?[0{<kb_}0˺ I/\Do;*5\MT8;R >aR\X_ߟ4@ˉ8s4S˹Hqa!;B骲hʹЦ&Y*PJ:u,S* ':yBQr`9Ii?dERig\w^t>ȞJKmOWIoddq0j(3sIw&n{ P 7<-1쳲Lia+%|=<>S Loo٪pi+YW1]?Iz.gߋߨSDHWR&?}?s{˃ۏ|eI'uxOw/b[Obs֟d&3{(L!)RwmIG`N+Mm]yZPBRl)Q+lje0~ͯ= g g gE^FJb*RefN"ygYJvR{ݳb{VMB^8 yԝL,L2 u~D]%Vf"#REy)P؃228;E)BRdLR3"4cl F' 4ⰆU(2.(aFvJY Jw$mD &A'I05@&>Q~Q2psXG@ހhN8P Վѕ ?-ǖYN*I;pX(I1"Hk+KnN2 '`u `uQZƫTIƫtpL鐞8 p: 190ymD#F[E,V y'D ƀ5)XnBtTtT|}%HG)A&"wh)3#XNj [DU("νDŽaաQX#v@믏BS>ջ.-ܝԞswR{N=wQ̹C>bԘM:ɛ0I踈L Sb^uJb^uŜ"f#cMs$[= }GcyRr.y">]V!ԲMUVbG{n[Rl%ibFH&|gƭ0$X۩+jӁ;LkvRNuIn;Z5PT2CNF"{:-RA.HK1 \{sTiM:گOhQWE%Hk}4l~&orQޡiqeӻE׹n{Ggtnm#V&tGIgVZx,LO`Mzk\о':~e`R8<,;7>@m,'3A ) g2O~z!Їo Mau<asZ/n[3SQד޶azWM߇vk7_sroא]svw+7/8z{㛣oG?ϧk7zW}igNWa^|t{;߸˫v#K^xq\N_i]:?  =co\J@xqa küE07I G͑N(?]#9߻nW~l4cb9c -*]:U cslKw?3/dI?ǰ]N_A#o'3=K_4Ĕz?M?'7׫2? KQI`npq CuᄡN S~w3kv'Xrhï 3V0$ uk'7ip3_]NeBQ Kp|IWN'2w9Mu܍=p?4x[M2n\6Ifܿb>`Ν?ʀ QE0Ii5Fw//h܅xy2'f}sJ.8!L7}s5S~}}_FJJ P/3@BLajFi GJDQSyAb=-׶5~Nz$5u k@9@΋{5ʌ13UF( cSE]aZI`p z.:)4R8lG9Bdjbj g]@5 \1^`9X4M="WTi#@TO.VYӑ>Yӑtє-"N,tKNபG 0?"m!VQb1hBpNAYY @TJvN056IوuVU ⃍Lo5]-]&B#͕9 R$u@X(`Hs9U$``TEcW!0),k4G]r0 VnJ}nL8c=RrR`NeEW/0p tWw@w$}VR"ul>JMZɧ|~y)j>OۈO\{{B1V7ߞ](7{V{rwg*OP!yFU漃Y&? "]oF\G:z[ҩһu!&)9bCYo2x#D E}MGKlH $8/)~n0'Ɛ/ٌ@+mP̡)RG'w):AzTΙ+ܢ`Dē{41ד&TܠҒ|-&utrw[+OǞ 8ۑߪO Qs `ܝr 2bti\, ҙ3jy_Bcp57_u7v7Ri'vwM1v3 C91_Ȥ<'/ͼ+T_E|-}5g>b1g$cV'@ړLڳ߼Ӈ= xi>k7qF7tޜSIl^'*L;* $lٜg6qw]a_f ٠ٚ3깔؈̀M1Bhq^`U0Jx&I3(4lD;ưfj PgkԽ3q&)|Tإl1IpwqnYYƹ=,O xJBG9@]~[.n;:үAt@l6ӳ%<.llPT:`2Y0%ަbW;Zc0J}2MX-ߨ!nDTN(]D/%UYQ{ o\ Һ|F.YӞls^I{ɤ*>4/M߅dQ}98b\ODqjGל~n?ܠoN?ц"MAS,lcn9n)?$΁ooB4)}V@7eݛ9^tݛMY{Sֽ)7כ.D [E>"ߛ fU4f,z)PbBM 8Z劐Qiveu|sz/{.8"y{;wZr]v={6 odpFp {xLjA#44uVIa&(*Xq^Ds=s=s=sݜ纄OQ0%2^snL]By͔43¦ye'ᵦC{ֿ)e{LnROcL$aWu<ĐZYb ! 
85E1jcB^*aDB2x>-j?{WƑKnx|H%U꒳+޺~e[o?4e9fH %GT^LJMxFg:EV?~{>ՙdu&3Y=%`tuU`wHWǤmvۗaCn~jI2@Kf{r$Ro򩝎sd Wz5.5LmiI~}!;%D3C7/i:of~3mb(֜j|%7uX:`:RB.aD?-m#l=?_51\ω1 h2\W t}+,/2U/AԸ\*$ѰO?|[.b[NZ($_%< 6NU*h?Cl{V_͌yqf] \ q5k]R 6*>xm7@Sjo>]X2|f=0ZQ߼ Ƚ77܃܅oi2SmJ +-& |j@/>-(9_4FՎw$Dy|6dZg/cMsP}4͡1o;ێi1>ێ8 7ooq-?6k %^OCT`SXugKq⺍* Y$h uqw7QӥV`ǒ-k--xp:PI\I&G[l@oCB,gC4!2iLg8 Az 3eL|>g@t&uvvJ)RVgM0Ak50"`7ۯ je;ޟ<OVx]!h܍V^'ڜќ@[Bg!tw7H`oHIgxLUFZȈ<(8ي!h*݊*ot)ϓi{%jH.iը%}[26{7cm%1Zzoo>y.kJ+%\r+_W%/|YW{s ̮a}<8O}V(J ˒-#a w Z<ͥJiVJ4ԕۯixlڭ8.W_zHGp\ 9p1dk\"xB521Q$).VC9`B$2so٥2_!Y rdDFun[9 x:HysymRUXy4ɐDBQ(ufUiWS!6|ѝfc֟Ҷ~?o{Lv#C韑8:?}Y%魒*۵U;yAoze?N_e:7{f|;7;8.PAvr1,s/o޼}o~y_($XK7\eWxK1cw/"9HzڽŒ!k0uhm$"3>O9,Ώ<;Q Yh}Ł.SޓGG;,,%ŷ(JIlNkM4P+5,8iz$9f%xkԢЅiLYkuDT93J %(NTYbRưAqL!0ԋsHVՋwɮ]e{ԂwtC(FbMY|ij 1ThKJg&U}/< AIl dtC nO +!kJkߺ6r#f5xyfd[8_KQAtS[Q$3.j~Lu , :K$ŗBU ;T}0=\FkzwBSu8;9^%]%t>󼆘5[ b '掝N"q̊ NCÂLB*(kQ jqހK jK:)}cs<=bvRi&11caC "fZ5'^ǥ `QKz@9"GP$3%],L̖$$JR..IK9lvSbY 2o>\}s5Kz߽>a=6IқtUJ۱Dm$Q(ZIOd,(9s9DX"Vʵdi/XM5D`/$n9Y$)wz>Se|A˾!p̔ }QA2}Yp͝(ȵAd$aJ£"_q/̠p3;{ qV /6#{╥50 1k,bIVd5AIL4B+J̡$+)-TV*hj- Šd/дæP7gǟn ktUj >|x2s%/ِ1#Zb,! bͺ$ TC1fgUbk\bՃ""I73׬LIۻKE$4$% Wg \gjOJ]H;ANi3ruj,H%UPb#&xYu$vEH^y< gN%"JaI{HlV;.$D뗧דFW%Iw^e,IPrB습pT¯Pp1a DArc ^igV̝ 5;i{wZ~uCŶ'Ἔ?~??/LFq95_sӏ.v]V[/O/?;\#xGKuk Fh^] ~{? g{c} ظ /෿RMi^ją[=Z5/u۳&Yl$Ar<ǣ(#ƭxlR 331iNJ֭/% Y8^[x}F?dg}I tFjl4Co${ܷAIFIӪ}tqj&ٺ܇I%ǸHdiˮ?c1E=kzYP˻`镋'[}EӇ6d@XNH!m@2P#b$I=y =ؑcϪã cVne։eNu~zHjyDJ裲6Ig$^@V (/{>[>;a.XVIMS"vĝȷ0cm|?ncⶰg\X`@E=kzBHJ7}4 X$ ܎JM%J a}#K5 qJ0.Olhz5jtlϒU{@ 9+j <FJ>l[h**^* plێ b6Гc!>xyھ5iGԁ26(XrYm/t|<77 ٓI،ܾ%#`)bkhֺB^V K-$SIzcàrݖ-@vDdM_m׷6=DEqvtSz Q[dVMRFGCR}k6G K.A"8clGevZת-FWJ!-CPoe "Qe*B$-5vtndN0@DjqHDj6$4+].vmK]#Zg$Ll:tb 1jĨc (5A\ck,C/=Sqףr*u5t9ckU jav7 G-? 
ĆK- Emupྣw'ClvCSJ>ڍZ fz*`=]hycBmmMP^D᝿K'c9n![ic\h_u۷fdH2mW֟o~kP)3x,)6ѝ0ggQe}+xJ@٫+Ѓ ϪYI~N;b8PygGl v gƈ敵x .H@&CH[1B4L1b5C=(:$sZnf;:5BF ԡx g4G&yL,zK=ioȒ7K6؞1İ Ɓ$67p#}-HHQ2A>>FbtIhj%mdRD8Ki{P๒NI)\j*KJBR=(Hhg΋l*T(o @eP/?%E'VH/KS}@/"1Z yR.4Lcƹ$EbkFɕM)a><סZsPuPט7z-N?QC"ٸPd:s@ i8$ 543R.u>/u*H$S/uz<O= d$%D P}-tA~զZ w68>5Nc$0nN zК qY*8D12cf z]θ&.A L6EÕ$w"6d^-QTcBԄ`YY5e[윳j!Iz>$A# IZ!Ca]V$`;M(cf;+vc~պCZ0C y'HM5_0mm43&f Ô1V H/9Z!FdϮ?$l>L׌&n?#׀ 4Ou/`q&egMQRes>L\Fg u)Iřqd%i*EcFAHd63c㌺TɆ?-Yuf! ӰiRXӊ*.rP-pۜע-]cRM,5-^^LT"MdMyymZC$l94GüJ h:ix@U{uf5e.Q'3]<`-7|PCn GO-s VHS -$* VuI||[=OȊh[?0J|}RbM}K9s>r.,Ta/F;q_q{;TMUoN Vi5pŤf/ 7fZEBigyezF&@@ls(154ѹx-D UQ ^Х7Pɇx-M]EZIZ6k+6CH`Tڦ (rMa}%ݫ#7qGEw]4B~|_cJJ[I` B&mǢ)Z+)DŽjc߯-&r|c,EWpc({M_ _k/=Z$Vgh-_Khv A%|x4*Fiq!4KwBZBeDĹU'Qz-WkRCP9.j*"2a*'ky:;i,ȔO( :Ԯ%4o6qȌ ڹf, ʆ(mX 4SŎ+r+@%ZWT{j0F@O%#"IeN '(-h9p/?-&x9pIJUQci? "6P hH0!(_A}@˙49+OI(&Bk_+E/V[-ll) &&12^ifj&VRʾ~syY[a̜o?-{;5DF<Zuq/Gcx4vGcx4v>y46n ݆#1%UPf 10mc,'}.tL14"z9 i/߇ ';/Gu;jE'FXq#PBȇ@a3^N?(Okw4[&`zl"Sr%lur'$7Ń;٘A?+m7H1ԥIli41*NIzi"IJ ({(Mݬ4?H\_jp(a8#hb+qxơ}7\ HeY깠F,Y^.9ʤFL4PiN@>!"LJMhz˝ќ$U=cbuw6Q!cIU>x9T+N:7"N~3W7%L*U`0TT 7pmPg<(\/&֯PESP Pjڵ^L m*cJCWe̎Ua^K7EUvc %p$NF99hDJF%2/@;X%M!\7J}+b,ow/gBtWx6e(sW_}anQ)7/|xۋ|0IzVtseKt|* ,+7y <5'|G~þ袮$4Ӄn.B&< B/LL^]W%#7l z;nт sQm~.N3cH' w=ִˑ6?,G9 vc߆U x.OΊ[77h~_&M1|+uC9l ['%s ".t臡G{Qva>wl:hQVhxdnz{dpz %m'ڇmBA<L{0smɧa $Rw> !5 ~zpz.`~MaW||zϯA6i^W=O1QgwVn-~oJ_pvᮃAŪB[Q/d0 L6^,Cv |ed$;x+_63Uq 5_2pcַCs;bo[&߄&2+(Kmuw98(Bt"A>/ 鲆L {X d1, (Nh.n,fwg[wx: g[c}8~g[+v^o)ȥ1`%710p_T kZEj?I[Þb)gJi1a/]IkdsFF8(p֔+bX/R&wP_jPs9:"8 KW:MVpfڢ;&1M2 ie5A-&BzuѱP 0Պ~e,|i37jC^عƿzlj-&h &dzK.'xƬdV- :RZÍ-g,S b/g1cFC/<@CNU?7NgɮQ{tw f-<muckL{!濽q,0zbT3Cr[4-YTOhsvW'iqQ8&hmld]$q nj~o|zj]҈.iB띋ǫ@p|Շ`w`0$?*D{A-9nw쯺f ˏr3*_ң U-79?Z-~ݠ9xdAfzJa_ӣ~77ytyBx"_6@ew{GQ;j ^9 a~ J1jw#;Oq:S02w=N1`B@}hڍ'PYB$Z&P<՝z)JSnuaPtQ=8lcܭsI hbR?y~Yƞw 2S*4n6ꧠ,VA!p]bhiSHN{mzF]k4x_dkT /mvA弔a~gK쑲Ia|%(㊢V MNZayyBS(3N$Rƙ´>K9c@]qsk%C800:j 3 0Xc6Fm@0P۾Me"ٻq$W*'ģn3ٹ솃ϲGY*&/(E($h-wdDf"ލ_+*@qn04! 
*Z:e.-MTfgx#2DsIF7wr=-_j2R i.+9HPVQjM嘛#*wNw2㿍96w`gd/!*8cUf8B3{b_۟}P(O#Ψ꒹ۻŏ-d+Q(ҧM/qns<6C >g}GP_?^J|yOb51) {Q,Yb_{xe{Urei璡 b'-v̟6ӓ2t.t[wg S,bq#G:%YY:*z{[=̩mRyv}óBHE=loL;fN-GVNc2)ucT5.-t Z:=Ya9ȼ즏QS#]xZR"Nn0aL f"8aӝrj*s x4oΛ(-zbm^9@ǻcs^8O᭄00ྐ TN)8ɅE_5nj}їOE(`Lrwk&'f_^[r+Ooüp*/<(8朤95K,Q:\DB*MuL#$\zupac5;[ǯecz\\ dT=fL.{l"GpkJ@NxK5^P(: N Jd)e $jyGcxdy`c+vv9|ϊ K^+.N^LLXReN1ԓpg])э-| yF2qDvQW&qX B:Moˆ.2 KIi9MA0NS("%%f8fǑ엋m[} C",eUsLƱQI7n%0Db΋4`&#E84IxJ4IAEڼ>n+G9i*̋Jch#V\CgDOڪ+/I55%gsQzUq, oNLYfdu9G8a-ـrW/jL+èG)xo+\ .WuuVSԇ (x,ͰHA"O+%qOV')x}_+/^ƚ)PyPOqn~ʊH/T`_.&jDU41zZ[9`ww9U|_N$|~R]3آ̝#a]` 8nCMHN"D?ǽ: *p*V7!Kk%bZ}@] (p:R6T==PU 看ϙ)q8?KyӑZ8%1j~{q\ŧy|OqXFխsΆ&}p%~J͎=*ƅ} ' ~1?G2(ψQZQhx`cCg`mt9dY])Da<'5o||6Fhynj(6J:\ѵ̈#zHU6͉ ~.`3@ʰ!2qzt}C\պC؞aF˳T{NY6@‡ k PpFŕ' kaxǕ!`sAiN(WL4^^ +)>L KZ 1?{85?NP$/,G?rԹd8O;$V.7uy{]ݼϜLeΰt(*SE3$cgBfYRQh0uy,d`=z ~/hB.xofde"<)2J(<ɹ"$KǹH QsKy2+%AAVz6]i=CVk2L/-^Ӯw7HC`A;d*^^^-"*e;ɺ6gfGNus }_̂laU@wNv]bTq1UBM8Jm%\EZi (N4*Rs^Ś$Q;c+ ]A&#N}n~-jVѳ@2:oV<0זwiV?{^<v?+*Ԙ[b~E<"&kJ=#pZiks5RD%tԩu{ȓBhXyqhf:%4⍮:K_?뱃$jbY[);-k `/ TxS w 1(ǃҮ hu ͨK ǙվIC!lV)A.D]Zf)s\{MȗLӊyJ 3JQI𩮻 l.EG&b3c,qFuXHd%궎"dF%\Qa_~e{ɹ`>&4W Ns2qqIcƊ`qs1 CfÝ͙#q(&ZO{c1]Yc̈́F ,MP`.C16sE˩rVuO_W3%vt.ĝ' GC r-=')tz ,LWdH)B⽌~ D?ߊb+K9! N$%54M"H"# 2,8V"R$MrikWXm׈  rOh {V}Aw9(Ч놳^܁넽dcֺ1! öycxz̪1Q١ip6F-\^đʠz_ډ MFt}땩vj|I3WB02`SNRUU:-TMoN% %ŚiM$E[JpB@EA[Mn6]7ܩv r=Q"+uP2XQR /soWcTjEQW}>JL1bU/b S-R%1bMEqoj oW47Ή1\=ݺف6p~ T(1Źq=\)}.y&{ "O/%>"gv ss7n )a%}E9wXga9|ܭr^ .#0 )"@RD*qi\hq"PJêb <ŽdEE;G +}qDe²O \qIX=|uDgG1G@JĊw̎t].X9Z:>c~+3;{kI pz5#Nφ/DNDpF .\*s`@hPh."AF,FHYһ``$h:7s8 aْ8LpyBH;\(iy| D i!댗D|SIx|IpV;VL.7N36d\)\1a\Qk5J>5}JG9{븮:fj㥗myŝV fjtuw$,R,"JbmlȢw4v6PvԞļ+N7}˂dOAΠ^<E%܊N ?n!~Bt tm@;fCSQ/暍yV@xMa^e~X`Ttgكb_%\l#3va&Rij6H陴#=u4tx jMMjaiTWcsBN-3 J\(duSn^25sY~LLSN\ WK=`.ۧkl8pL ݨٻ6-Xjv2&~X,rDIdՓ;vl nc6$ZqwuuU9N}_=ÅlS<b}; '[Q*ܰTx[ knQ. 
f^D-g>'fռ3 "'N.\ 0lzS:R0̆,Dkdpoiwı>/%3El@5jKe#as@%F @*Fat`.}R=ٟL!65@ &:OtSp̳3C-,uwpյ>#C<}c+CVHdM>U 1CH0XE ҔC85:q)mlm3jem3}Hh,o^c,3}4 o1@0J,ۇIVFSKÑc'BD LHEMt]>f>f>^.0z'eAuuTow_u{],,/1ufwYt hA"M6wz]Ó=333}@(N$ȉıCriM # #@JAH#h2_#6 F%l% Tj@Q^&zcY׺&vCgkf|:wDcq+&9KTVt?| D"@CT״2 hUN'-^ (!M'_w:D2anRj\$ŰO?/]OzU /@}>fz69~\'fe0/ptv{:I:fzIyˠI2b5_^@~z~>:fIiIQ sah^1mA I_ L?v}E^s738 a9>,QRR\A𹴹+rυp{ 'D.ҟ|*e.URJ}r~U7,[eq/JXIs^Z̟HUq*OuV?;z[LѰ oώ,M&a r*T"T5\ gjQ-Koz>p&==?(N(tz=|w`Exp:.>v{^.W'7y)jPvx-cyhl.JU?`xi)\dTZyn.'3y j{8}ܧXnH*qQuC(nCWܙTwo9 çgV2CG#@Qg5LCT*U[i)|׌M.@3N|ug[k@v|%bx8fZ}-ȱ Ft *B _F/NrJA!x> F/ڟ:3K4iR{t\F)RA;zd "X v^WU9̶R7[򦋗*m>MNFg^UZ7;RmJݕ.\__\mpH_QloڑT f)r\V!UDeq@:+5G vD/i7Qݥ9*78Y( owL챷 )Y^P~9Yx}ꇳ]i~%vIwIۇ X<  \EΑ\+CBRILG :MGیZG$mc҂s`kVf5KkVf5Ym2*ks`o#9A' _kNJoVyDm$Ƒ~$6+&{?&k[7HL?NEWچ 3 )/]fOM5#v2X$AS@$E GU$\FjRB?paKlcg )N(*4FUtDS78kdF6rE*8"P D*+[k耈eYEފ`\`ōco3QpIj#X7-3jF̛˫A$ƮmoT$Mξ6.{c m!? 7h~o'O:k 7y_O:uDA uM(=nc!&i7Rl䇱F2^R:PkE?Xx0Pn=4r7{d77V o f7<\5Z,px: **-\e,p9ˌ;GbXݭN—S:6zNO7ixF;A,(MZ]LRU$pHV:u:׳F/fѻb?x\]hiT`|N}Pr2vi dT_͖:z)b;W^#W[@kvxо{TH5n61)kLL{f2!*m%<“Ac Q-e 3$2"ô6Ro#8~P~B8|ZWp\W^Q׼t4%OÁP)T*B[BpAI" aC iѵ<'7㱆WŶox[۲Dx[iA-GZIN f9.2 펣eDgRj`x#7M7DiI6:A QzFG) N8QJd ~?&Wg4zߪc.(\ !{t8Vr,!ӎ 6)#Hm~QT %~]~o`۳?ܽi^W>;RcDY^xbRIa~a16$b+\M_9I'Nǝdש#v|1ο;rܠOYIFw9n-/-Ħ$ÏrmNc .Ĺz̓0n5#ᴠHB9bA)_}!ginNyj]N ^VrFj pͥQ '# JVƄ'Ƣ40J(+eJ?J8`hCRB-f9WJsTs˹fv4a9q^cB '$`n? Z=EgJ굒-"A 4t|F ̙LF'Z+ xAk ͳaoPڴ*"(8e%FJY "XK$yJf\N(!q!㭩A Oh = F"A8"Dfʕ+j0Нg|]JIE};po R)7w47_D'{]#B)N6'cT_|&\#myf e2UpT-YNu2.RH\9FToc ,c`stsj h 8CBrzi8Kcy,h.W!>kuz&R`z-=`.8n]9lBHBݱVX4+c-dYN&e;#i42%nldtQ> S$ IVg` \[B .TiOV:4ʹ@itCéJ͍;f2M]׭ x.-M'f~PpiL`4m-\32@] hR)hGO8D {Up-3VS-,F}$ &cGH?Z1E,{mj?^ 8 lK dl]Ҭ/ Oqf$4f1kNYX,UzӾ$iAN'W59,*K&ĥnȡ -tg&F vĆShOl&hI؞&KU,)$c 6ؠra&V-d&#-h^yɜBvX?1"Wx[3@t{. 
VH/ p()U/iHbsPas(?$ =\aRP;rS'r󜹨 Trk\sMD<3":1,f )j&x18bD&)g9s0t~NHd'ĔPbK|;Zu^Ţ:<꼰C9ZF[k3%ajq*rqjv @16Jdc-IY]ydCmSF̴8SgMۙj2r4m!h9͂,n ~9-u A; Ҙ94aIYeIF ^'\)YkQ{wKy]* ˴yKb3(5p&]i :,6}Ijg.4!pW76ȡl46䇕4[ܸsdYh*x9A_h=Y}ӬS!1ƃ`:^6G8PJEmf_g;HVbU4IQ ciW6Q;#L)O#k8A^E%fF(yL G>DrAZq@L2՞+P\>ӏLxV;D.PdV\g;HHAޛp2ΰ9srJP )n9bU#): BpR0!_W4mBbﺗ2| _JJXE"-fϸȉ)aSr9cMIޭ܂^ zm0 sBشF֘y SXo1N,A޽[ڣ0L]Rc$!/cz}I{K /ƺLjgzV}r%L)9>teą^:r)Hf ;:'F;j[M391}O~٬fͮ^Wl۬CIM:MG[togF>[;OD[^/jF$kK4nOdNN{mMEf#wEd1'kL5hwp!UK/~h+FGɒQ%A'Jg6[+l{SĹw~mr?Tͧx圔FzNrXɿu>.^$M5 vBz⑰Zmk/jmG ᚒ`nMLXꈃܒ󴽰vRǟ]R;=hv:aXsq{5>\Dw5]r+rȷXBFX-\޾*2$s&)_KOwSmhk~]E'ЭGB{`sګvŻjDwV0%8'NB:VOb ؛N b >~٪ng2*+6O&@0w.4rn.fkm5zw+ `I:3V<' V{-+Q G|Ԏnr_M?^1f)q*⎶˻m̹'M.|(6&Kf{~\e6W,V*Z885d%?\3h)mWcm&I*i${/$nFC>1Gsٲs({M.r.1 :\K8gr`n0y6RKo>`^&u|<;)֦>wh@K2 n*kBx_n_i_H'/?|XZ5Fr6qLړt0#kԾUG`^ПTJSf0+x%)qaB˽~|D{>?9ϧf|jyjfZ1rD N1< 88&|@'Cc@%Sj"?6)O &]ԋ@O0}:Od'q:meL<\emuYw|g1ζ^}ôjv(Sq)-]l3a:rw uOO! u/'S+eܸƵ!on.sRdPUc$OX:oRwdwj5+>ЏSU1mv|VpP("^j`[fw' p#ܬ/?Mlܯ?0΢F#\e77/vէ/ofmNfoeOw9-¡9odSmq6{g+6oӣq=ϱ^9ja[ ?OPg^к9݇3t_9rwun`շvР22V`hq{v hN>pͶ6+hJt]Li^_KrՆ1 /yGǻBs[RڲZPzCP9Y-R+ZLNMjqqK-y{ 9yXΞ{e[p;#c|/4ZkH1GgO9ZV1håE)3/ÇGGqjm -ݳԊI-cr1~.x㭥x}E1 7{|hР@qA@}q 9}o !zF-Z Qgiht6/T_fp__틿}3]\JJ..ֿT#.]p.@gu|((-!bt\"b'g ĜZ.A ϓPZW~2t p=n;{'<$辞$ЫM)E.`MvI-gZiQJXZH4 C]^S2WeQ|Z=S @Em[{A$=Å3e2F:P257&e Ѫ<L;/$gw.KD X4K>{~wЬ[PҼ1y(k˒1XGPa<@mi'M^ZbŤ 0.ڕs+wq[58^% @&BRiZvIǀ0*ԂAs*q- =%N.UҞ2ʈ:IRJ3(yGJD'ȧ Q8$Y1Jp2i.h%zjۢm8 TR:*\^Zzd:Hka1d@jc Xt`z:|^gB#NfM??sǝger;bzBFS/*d E/zDXNRqzG$$EfC>ΓHIh%x 'LRs}u= =mףu1[9[a: J]t 1VZ,"o}JAAB9C@-l6pښ7py&f!$YNgVwDnwtkox!D%#nrĠu &;]{&s u9\Ŵ4]>_I O,/g-Ÿ?}Re[T/R%80ߟ6sRJ\^Gٻ6n$noUÞlme7/q0F҆I E_"YǢ@׍~`ݏ "pZ*Ǔ|x*jݼ~owf2fc3a<D0.%o#.*⋲wуq0(!CSGR6zִŧ-=U2w?Lg;h&)`4$DSW:U ;'j8xtߛU}pg*]8o.Ə,T{ozM)['~+PBgOWx>L_,<[oNQoЏͺ ^` R7Hra1l7*\5@\K%&KyynWd:z}!, EIB ]ͯ~eSu X ǧJ>7`͆Ꮹ'K [ٛ&iv{_r~vzo?O{f2y@[7_)R{uM]н>2=W[Vnv#+Xk۳ջiz "ScCC訦ha'k X0X(*Q)jR0\/JQym:DeSm(.8 TYf>Ǣ[T;2Bˋ9.uJR_K`>FJVN [mWkJB3 D 47' cT:'w׷=_P2/1FW(dާ}g>v`DfǢ>s_mUDP"i_q&cZyi4kf$ʘ`}JFd?b֛XcЮ 
YX̼/s=k,G8cڴ~q/FqeSUIC1͹j=BSOArdsW 5K]=O7Jodl9ExWi6*ow;;B-*.]/Nt՛U fԔ 2[Z%O k!>N:Ի#I%CRUJ؈3>q QD 1afBiAιq`2SfXkukA19]RrtZ6~XF.&!\ FgYEYQJ3%B{krq*d")(kjpʘ<(˼׹SK BNro9Œ:$39EҜf9soh%AuTi\DkF(i jhsa@ ohYyFqAua^Ud DfeJ]q}P)L{A8^Ȩ EuXCzjZ 2 GS,gHg'agc)cXg R'J R'_ Y9,PlӠZ~07 n($3`$cDq5Ii1(9K0D ~qk0-vZ?Lߎ `S'I+$S$'Zopyi3eS/wKM :$=sc,TnBr$W jz22!9YwY"{8 .[F-zT >Ӫ7p/}IST>-oerOrm)K-w,jYC8[*I4d9R;C3 jJ7{2Ǹp L(fƔ6Nuv+6]&ߓTt\;HurR1e_F.3f%i6lnWR; U#|KPD"T͝0EJ+]\% f`E}#=&g9V}8x JAs ڑ4F]ou}!ɿDL*J{"fވ{F X Da#D@D#',QnsYdw O%LfnI ZPi-9YL(,{OVFƠatsxf*+G J* [^:|+ ggy_b| 3IcI3KϿMO #͎Oe+Vk+w) J9Ne }V難c/&8a<x;a 'A1JJ*`4 E( V scL={?T7mҨGwD\g\3ڎĮeyQUH0c21ASxb(3N Si# ̑ m~vx"ǃ)A"~*k@yg1rf[.POt(5JpCڇ+{+AiYC|&˧ Q0߯tԲOו 9)īMET7ý\};A*AGs2yw˺{Pg0 VA93J&_7 hj Dm~k7Z z. ,P]JbI[Es ~D&U=?(YMS/ƅa:U'9*Us; (u L?ڲT8p3zO,lGOC{ͪP-<~Jো"'{WyR() _L& <\]`_MTdzd~{J-4WI)= l+* I+"VNi7*&-Iv;`3햼Xքr_87>)e2&Gxb׋A8wwK´wVe XCJ QŲPP@HUUN%.f&p7y<>[! -s;Br7h/"@ wnp>0pu؞Om7* updpeƋ (Rp'4&s&!:* p6c[+p!'LP1mUyVPB6TVװ啈q LUxiB1}*)Wyto@*d754k8& 7Ϗ>/BŲZ8zjB\OG+*ķ[(Z^<-W`NwᝳM "P8p?;pa='&W/)ro$G1͋ݚrŸp5TuB,Z{ycG^`ﱼry0aVו\fu_i:ɳQ؜@;D^ |f 'dWӂ*-%K;{No4ey+{|RK-UQho!Ļn,ފkJ9 ¢Zbj@aĤFFӝ3s3M7Q)e}vr/G1Xj޽+8oc:`c^DqX"ҙMxe9cT!z<9ցߦK,]R:XDUh_.2&!p !"xYYüH:ggzH_!2h`)yГ^ wm{f^z!i "n}#T!f@eŬ/"3"#&GE#R:0{ QBa"on aaB5ʌ^qĔx Q7=.u_&OiZϮ Bs>f;ʬr,ٴbRD 6xHyBpȃ[ hii4&j4 ({('t(L-zv-&gW=ƌBRQׁu,Q=gQj3h ^ <P`']@DpԻ5!$gjr5/ 3&̆Ǥ7N-мϗD4lZNZ2Lt8ГD*kvqPNԸ ,aNdK춞VRJ0LJPO'I^H2+V<ӷO6ZƩ_DZy!xLo '/t-ey(|HѴx-uGs 34_-:)O#JN>vtmYM|R%kTf;ev=.տEcɫ{5\POL^7fξ64kg3ɄU)4@d :(o^>㽔X4~̼is3 osk|zgsWYW~?kl=ӣ=!p_?pVQ% 1wۇOjL!ds?jnNS|q7FN?=߹c5> \EZt2%@v?/JaJmH.{4|-z@@w*,}CZ\g {AD{2D͆]e2kf-dg'q>vEfmǨ=MejA,*U/c^'.I:mT\%k[n}C;-Ozv -DwsMe85`EWݝXm- 70WCXHex%d%twFۡ%9)we;9(([wjA\69. nh1 єI>:v9 8!?yγw-7՛qt1iDRIgBPGtfdq%S! 
$qFX|=NUr-/^>V# rjކisEhX6C(XR&fǴi0Z@E#5Ih4-S;hiAkkYbRg@iV ǴvXuu[#.:;SωB0)V=/PނpelMyd!ɓ8EMR4QTZ "U -UVu u23KE5M DwX{XH .AoSJpk=~(bTVhB3'ձ  ȭ fD.^)ay>݀Z͎o-ܚ soV s']Lҭ LRBzz̅E2$aT^y&mFj*yJVz\L Tīiч&'H4\[#َ;rc0Fa0(~"r iM1~YHl] HP冇cy>cy>׏9 #NGꃕ1&e*Pb87]4I's2+7}Q m_3AR;^ⅆݢN[-[J(ǎёUN vO9銰zhHoZ\DYijʪZ5/hAho@ !tݲkYE-ٙl!7kΏ ߢ{{3d1~fx҃ 2e?eetEVPp:pyzNc4?/=>y8,ji²49 2 oc۲ 5z\MK}Dsrt{T;_=G"rc>Ɏ q@FuUr.[w_G`OGm:c^Pݹ@+8Q^>:5DBN9CyF)$G F{ :`Qrcn˨MH֜8ouƶcO j%[m({'K݅o ٰg-BFT>G5w>}O|a=TMS 6WBȀ!Vq  ذdT /v\2f]B!{l*eǵ+kqx;K@vmr8 ]n]pbp00I[SS4QqAʾF+fzہhRv!HEҽ1E͡ON|Y>eSuܼ)l05'7"Yri ? f c59;TPJ45&-HT3cX/)i(K%/c4 B` R 04EȬ5x&(À,xƻ,pZ-r߷KAY'"t^0NF1p*&{)rd`$K$~qJud:Jp'`")D/HT"݈=tQpⷽ(h7e)Л*x%gebCkZL!0v+mp[g?7%ym ]z8=sΙ39j@I.:O~08Z1nk F aaAh  ᇝ=uK#D=شֳk39n`&PBGkm-z6_0t9KIA<$ r[+sSq6eɐ&*F&Y,q PSPrLsуm-k%X2IV ` C=O'I^H8 IU TáxK8Ld TGO mpz"\ z:ۆÚq){-Ռ.Ljq4gsTzRPZkR\DY/H$Y)ZIقՀzRa9;ٜ0lʎY}NɎOT|θ:= b<>{PT?*ˮnNn5OP'tq7py<^y PxƿEw2s@*ܬ 1cf8Vx3 ht˦)Ur:Uh~_<+͛ ]ۯS@MTkݧ-k$(V-rL3ڙ[Iǝ$'\$PiaVI5=HPEp\sK4qh SHtX%tkd"IMUܼAfoҜ aLU^#78+VpD49G<$VI`x4xW9n9)qml$szre'_s,FE_13`VYQUd#onN%RI U,\'OuQb};>x?fSp&ڣ=R_ wׯ^MZ<9#Fs%, ŨQMbi=O4&M5e3HSQE2_'MkW-ww%_ݕ=tw duo g|8a;JϾ@J2E yǠ!ߔ}v1s |0=V1TC.]o/gO/,{å32:|oQN5ņDk;Ë; {i'x-p|z -=l\[x^'0˘wg]y!cLsf߻&)i&!aPT# 1+8=X oGG^Si؁W782^&۩k^ yxh\S^xRm{f Aho!〧 DFHbr C(Y0;2BpvZ# .훧ݮQMq*&(\|D^SLu@nGnrg#sy)U1mVT=]6 Sr5d>+~6x}H"7*-~Uau(h+g'w~>kt+Fs3qw+ys'{k ~?yxS׷3ݙ/ѫ?}SJ OR-X/L0uxL(T ogy⭢P+EkR!Y!>{Oٳݳ gX ԉ(mHgd,o9Z0^"u&z,?uSyw ȡc,oIȞ%Aq(Ԗ_<҈*N-'ark:>fb:4>-?/6>|㨘5upZ!7@}Ifީ{hy_AC(0Ar9d sL0dczi噪_uǶ_ (oߕǭ'ovWhN1a>H缾txVr .q0ԹAR@J Js ƙNDclJFO:UX3g=IBp*~m`8Sҭ?["lQ|p>V#T2Ryw}Io~- ]nzi*^WjcQ5Ţ"SjIĀX7"AW PkA\y6ȣiMj"¶,{ȋ>đ*G}\^i39q ~-.\[CO?oSXT|ppUmw{;lOdE[>,?ߛ١˝١˝sgo:X,㔛4 sHjf%ӆ% ef8ωt <0A8Ctuy W%cYcR@oz2ҽj#N,ެn,Z#<|eW>x'\WjL$GLvR`D\c) z;$dͬ8 =Xl3aw5~O &VmSmi9{7'7W~YwM: mv?/ӷww[hR(ywU1#=,` $4+xOsD< y4 `]Xܿ pey 49AD Xph"O Tgȝ#oH MRxs󺳇w`F;cy>+84~ЩZy̾DARK]8YWk9Jpzib^h`klx}jYvb[ks-ǹ6׉7_4 Bbw?:}濝ur V8߶<^z|53b-\w/^pYCZjdT E K~$ե<9j/CBk;a670WOwq2ZYQ!ު/Usrhb1ef6׸ |9V7o *%aHDWŃG; ]C$;/״vq>VuNbd-:>'pyrgT̔ʨ 
ReL1*/Tt;dNq@bm++'ef&2C\,k6yA/6٫6i'mS%K~[%e *{o̎0}w!"Ph 7|4/`i<~·?|u<Z昮xULuwo<ۍu:>t}h݋yΥ}R4X;Nƙ3@˼Wrw?:qά0W\/:w?6PPB:cYiS*.)~sWbJ ]+1,ޕ(weHF θVyb(‰$͍f  $YRHcpٙw}@Ai7eE(-ۯ\rݸ T@ CWy* v&F"ɭk&9R&%sbHPB#@Y#'Ё$7t=Ko9dHWGw~<m0e0 ؝* # PvvY"곷0P; gm WyL4KM,+IfIPӃE I5R0תm. }̉ KIXOC)-Y1![jAetR ц$NQ6'[b-~-ŊbKkb3"m_]i~r2Df䅌AX ٩Ih %P?de$0oftvrQe:F!e~+…#P9, !;{`^Km(gWnB*d2w+TsPBc"W&ݬ#VnG9˭n'~<gڲZ\2HL*8)jܙhwG@)~u ofO5ʹ44 !`[#DRhDI INyv3$o`, ՠ FjlKՙMn~ВՂjPHGr 4oW~) ;~urIX4v%2on@RiTvXً!-K_ [ қG]_a 0z=\v m$Fox7_K,>9tOawۘ7LlY0 ~8̚롲\zXm<>QD$j3JW ~>6%h5ϬwY)?#>8Ή()kA*|.?NCsL$4RTgJ( (cK;~Qؗ rCۇ0Pwl׏yrx !(CS  $ Oj99$^I`pG}lEP"#E WOwE'"Tx>[|q㣇%c:edJp}g5b:6Fe /tsl!l/*#Z(gY[TFTFKDSh ϵbQ" !- 'BI0 x[Dϒ!HIJ͟GK0o7XVߦmתIQf! ,\ R^q:0A1ݹ% OJEųI+%R)βn&$'@&H@%cl` V11D~`FTE< ϵ3DI$$DbǴ`$\vTvfF}@Xٯ9)b}ߧǏzcw!7"+TÌ/yg=$(%8/EޓݜlIEs EDӬh}a W{v2ۓU)Apd'`Vnd\}Ø vSrg] 7ڵCRz)Tc)>{P2fYyGa{< s]n{q[DF1_kai@@iL(iQ"(c,(z^; ή"rո6X$gN~נkA3@bP8HAAyR/l0; NLʥ(5藊L\_<_Y_D^$rc)޷VOwnm9&%?rD/߾(vJݐzB0&m[G,F1J^F'UQ|G׆%?ČDs`A#3A :nۘnߥUqܸR[R\a7 v7k>9>n7z>_s<C|rK~y }t{%Ur$"CF,R h9Pz*}Ԯ߉FGU|</c)eM+.($vgUmrEYX[D'@cvXoP;,:#äPkD}X) #h(vs6W;qud\bwGJqPq>ayN/gdW!ضv]kr!-T;ksCUUF;=ߧsDzQxmݡiTA5%` e֐h< C9f95.Ҕ,$8k&ٵ1,g2|$e)4̈DPBE(f_P2Jf-$COMh% x>tl +k-^wb!zsu ŽxHH}((^dbD[qWvq>{„ȼ m?+cxP[ZBpvyXd$R~W-݌dOcQ/ %1WVq֞2SM1mh})vyLqEb+HHha80Q-sb3@aD3Job)KHi `9A PT{6=?*aYFB|/8+q$/7{Un[*7"1~XPZ$Ɨsq !gя~ȕW- ܺla(z&? 
{\z|Cd>P_:-fTZ|U<1ξE~R !%^lˢy&v+R+GAUݼd\JPRD]ytLrong[wF TS'%,4 %Ggx7$º\XY~[BXP (4K{P0\,kԢcO.0޳cO^H|2{ Bf-)BJ2Mz L<}f<Ϟ'm!\1ؗъLOzhź}p((Lܤ'|P f;JO)1&7i+ zZ))ARXP)I eQm:i%y/r,@NJQ­ ăa$FPy:V,P R3@qWedt#Rkd*g$v+L8Zց;\i]&['$Le$B uT}EVI_j,|wc't~l|r28Wݢ4>[U5l\t.|N z;sssp>Qs)H>E{=ٺ+\]vɂPPC,;:b'T Q&p((hM`QoE|k}S{%< EN\1< |/3Se EbJTJ@<<(d}.j_:"T7[7j%Q{H$Q TFSQÜƣ"BWTP*ծj# mYGhYyJ&NR),֩ԇHǩ4W )ԪMU&(T(Ӳ Ҡ<eEޓ-$+ ؆ȬCY:awb7O㘨%z(K@8ߝՠ&'@zIYWeef," Y*, (s cdl]#e*ׇvI)D&w O|"EޕrGx`=7ƒ>]2 D,khpr^?-p(_7墢u%)sg_.nn7Z?iRӛf>ۆߺ凬yՅ_Ru]G*wS킜%b~VXD7 ~Cރ(͔:'~{3$Xʐ(`n?Q:JjJ0#G1Q: ^-s6-6XKX8>mA=P?jvR/Za_$.H'|j՞vҗ(I` ZϽ$IY\P]o#.57XqDq菂6(yPXlRҋEZ>a>ެT- E%vUkݍOJ*Yz3ujﳖZ;2km0HFX*Q,bӎw+x!8@\OVY'L7s29SISu0Fo?i"$\ʟ4SG. $;NG :(vE!"̗!@%2rD2iQ cR3g Dj yrr6F ׉ݷ@H9cpO鞻L߇PD>ɟU$<)Ry"‹hJ W$ArVOXx %`A6SP{]M<3Ad#28Fp`QbTI`@NЛg/v"ܔ",$!.$/6"gv)@r cIx^[[rMTQ9B\S&@m%y4N%no[cȦ*wք%;VO0sEp @ʿFNB|wXYߚh8 vzDnС^Hn[tÔa_QŅ>7~y}gUߩȲL>[7H3.%|Md} f@5i=:i<~Ys8Ms]hYruWŎjћ@izEoa/tdNTd. v"yKc$-<7aj)!ɵO+;=|gw¿{Q(̿{&yz娱V+u!YÈ$eEt tR< |3J܃cj fZinˣ# d%"ւB:t9 Z%K/kpјgup6:t0%7,􊖆A3␜En )8j% yct^ Ok1%51@Du@A{"ngAt)L0aumWڎ.beG\[!@}An~5%G>7F T2KDR*YJA} 5=d)wf~n|j@.ORK:[!CJeR@$/QJJY) H+lHĜ"{tUEPXzpp=AE2w[w$)ɣRN\a;@|ޟ?wցx?-ryc{ D88K6!X-,izY__^0X:v=,izvn߄H㝒^t;pt*Ѣ} -7 f^Aax+#IciZ xW6Mڦ}acgK>흨V\ ߊM*_k>,=b>15j{;wUR֨>GHrc~FdQ{IuwfaFEQٹQwQ(8D_F] #q [)C= 7w=t$5An4 % c__w١eZc>zC-2F|hHs+82`I&t_; Zch8%톾KiSkgk8*l8r"SF[OA A }Gv@f66<{P!!\DɔG1 ↜[L {d, h`Tp_vݗH8t\__+ze*/s]u77ce6.(%Pn\raaZʅtLtٛؽ2<& -牣=.sV2ߖHN=_3Jje̕=$c90IqPl15L{d˘n-ztϖx{oȼGLr-/?S9OݵawwEr)GI75y*X'Nٙ}<^i\ ToّN7w0ssWrŏN^><1;^=  mnnhw.+]~VlJ(X)mtdT"orRL޼}۸tW);g4p>Sȱi*zLJ]-/ejۋϴL>߅î xj@uK?q)Uw t?IM!#=uS`N2G" i0~C;Z1G G..*If*Y7!Ovznr> Ó^e=ce9o'}z0v ZmaiM*2hOG\4Z;:YXBUGrꀂxI$Q$)IP3P:AJL^'$LFZ Aq~L )d^o3N#ʄs'hr4IV hmp!֨fl [|aVDۧo3iPӷH wը.9(*0 ͥVS:Sd=" W֖jxl̖Zhmߺ[Y!߱"]4GPξM̩%;.ʤӴq$r"{R*lf+ YuM_ۯVJ7歄07<UWPޙ; FF;|aV =0<ï!=JG \ -=`Cp% `~N^e!r!;ȐƐ*]Ak"!zGv銼bd=F+Td ЕR)p ; %+0#q]f+0T^Vj2Z5K S倯7)\$цZI)&gIk&.kMN?sGG-󂖽PReP S/u͎}ҟ'5Bpf誀ԚcMY"T)zϩTՏs(8M nеW]۶R*6k5Ʊ:$zn CR5I<7Ɠ^]R2=$8s͠DH\; 5xfx;pF DukĄ(`؀Zbtf 
6`\JF̠,b Ѳ*NRHQҠgƐjN6PZ䈺KN *9v4<M W 9[N U'tf Ѱl2b 4V SP| WkU#Q ;5xu!hdG:qfSv5fџu38w9M*=.@~C\Z\ZhbѤ(:\4&7oSKՑN8Ï8~ &Y&~'6UIBO,cwCMyBRϟ@&QoYR?ק-5Uw2K};orZD?qw>i<{& `'hч<`Zܴ8w%>jS&S>-5͛>Df ܾ|߶`䎘VWFstfsZ}[8x`˳ C^Z0%&F3!r@92©gf J ՖtzS{9%y-adlŀl% dP%ȯ##Lz_ɔm\D\0P["[jB\aau0*‚1%b5 f,ZK5ˏ0CH? w AglKdЌ;V MyYf<*E˨zoSMQ;襖LDy7??x ~HrdPrGpDg)) AZz7z*U ^Ow3ɯQWqyGԞ߆C)&|<ڦƔ6,ʆ]M6Z;1k^f/f}ͮ.֮5j)?E]ޠrƞ:csYתڰkmcwP~R=|:$ACxxnW֖3ovޢمH)G0~q!IA2Z$ƭ-ߴmJ]ESD - q9Q5 Ԅp¼J2AG;#x%,Ʈؐu9R۵+B#C=;fVZpWqfEu]yחox\][fT޸ -0Xomvnx%ǿw;(3Atr\-R<<}: Iɴo:j$YJ;EI?Cs~B=Ó3|7NS)CZPSg5&g! ZF+Lآ$ % [+Xh戛|;<KG#ƈ3D w5680Ѓ'ѝ1V-4Fo?5'&ѴnEc9A X%:Sur)fyZ2FeIΈ=CCG2382Y[oՍEjVq 0NWCxFUvbjZ麍):pRt]D P!*xljQ 9S5RGYI!şQ u8TgYtJj$.hZ+UQi$%x.Yl4Ք m@M¹1$SQI;DɚjD KPH6:SDWriQMRxقfUl 5Hn|{KwQ:.7=nS5Ȣ M紒 tY-..\\DE9)Z^.GF8Qm#2wF™[~6kaЌ|ne/<-ގQ٧'] 2/Qm(4Kp8댌puVzu<z OQ}uf(^t^2&˷cWwKٚRƼZo/A/N봏vޜ=iV,]P~H6vrؿ;<ݓlU|t̍=Y>|[z(}vzrlo UQݙW0iC2Øl΢…-ٗ3+^ʺg_r3yW -︼4l uZ-'^01ӹ/즜 :0zfy_rmwJd&uxF#%G7|=Yb7£߉M7=yV6L q{^^ -ZL| `r4+qk.1]f-}$Db`=BJ B  HM ?d,[DιsIk?.>~e262 쇬́ӏbI$GJȅof77WK}[@V.}[-/aQ0(}[Lj o}[.ͅd.r}%wk| <ĭq!Bx`xx:#ll:@N5M8Z}\1A֟{ ckNrMe,''{a'G--fәWfjs1F!5PpLj-Ob6mؓ)z#v9SD]($~((>v鴷0r& C#>ث@όxcDK)ʓN˄f(>sPﳃ 1 ST \?~5w?/>k@jZu {_މ1 w3d&g.@OOC :)/‚ͷlJ*4}RxD3"E_0FXȕ߮.6fc=+##NKĻDj2©D|lÅ %64α[!œ?09鍟<M.h]^.YlfO XhረmIQ;1&h_~dSA}'/ZfX byNLnѽVfsxTث&9U<ΓqsdIXIǣ<0&d(js|_&A~ OL7O:2k<\5vZ>\Oƌ4l*-1<% Ӯy|n`$vwUe8ʦ,<;+A$ ݓ}A~v +)1?&>}ɖc |9A$w߂cJ%V&sEl+4--.LŘx譀{"İd"dm~ 9>G[wv$W!?+&3Y|gXnQv9=\)Rw5ZhneSyYn&zDlFnKJ3 ;Vܖ:Hx"Gn0sfY=mB?9[[U3EAu[f{F}k/[#'=D״NUnw8$H<@ZcĠ-`s6zN ])$%'Md$=>[U=E)pl(Dٛ0kLɆL `֦5pHⓡ\XA''#AY8mj$du)B㜇.Ic@԰1UI1 5a*hN=/&J&DǦ]jYɳCh)mjM!Ae'Q1A,| OCc*m6N+PrFCSdFQU[:.u?Y+}%AyDѐ+A\^6Ȏ()sʈOJe C_HèʥUpy!+#p&#)ji{y@f/QᎌɃlVru ^ӥozL~*Mҗt)u>}Q1EaI)>6=,ns{1/B/V ՞ş;3δrh $sh&o+D7\3_˔;|)E]6s>.&;'{ɦ&yqrnpvKjYf.RL ͰL- vE#a-ܚ-ק/A۟KX?~OO?60ᢇKI'1f!!j V}׹rzIpZzX.[| ]o-XjX۵Vʲ[߬.j4h[b0<1MJ>-Z`vSz4>&~6\Yu͇`7m!@hƂyռRO4vmJ6^&2fɜVdw]wxᆢJ q3GHu&0p325櫉qe.sy䜜(M!T{$%9Vn򲙂˸sנ;eKryC-,djm!"8 :o5 \"U.ڎl3_GBP-,)´('qPvBvvH@dJEtY$F$+p #w7.zϸWa"ke*&K܂ZQ&\mGq #n#8o%lZHcR 
-k!7iA"Y׸`ZV]ڕWÁHZl>+A lC1=ۦC'dN+96$wq(b> ӥ|[}UA~Bx%ű+>ϛo}zľ X֛?w>c=a'^RmG.1eMk5H4] SU1h ILj@sGn~M(s'{g8J& ?_wƲFG@+c3lT^f$Rv?wL0!Y=oѮldÌX۳Ƙ313~NO? ʶtK?'1faT! `d%&ԙ1I2I*U|”CnN>WNgQm%7U7ǯƳ~[iX XVuPV={ܲVΣsjʃ_~ Ν$ϳ8/ BXk(qvN@'+<=} gӹyu\6 &_`0Q);޶hx-kvFub3n:̾$^ȫ[b}c:%ys+ {v,;D ӌ yC?G+"{NoI`ng}Q!Ft̿1B>?́_;I81.=] Z[{'@Yؙ)aažČq0Q6-+A!.BΗ+<V2K{Nm̲2`[N ve"r F%nғXuzj77#6Y!iۘ鄝4)@hLt)ֺ;F}4aȠpLB^oDfl#,$>LW?"Xj^y뛚/#>dN}l{XC|9ҷ$cvS Ȇg!v31K)fĖ>]0`@2 3AqͩDpd ִsfPzxwpkFy# 0 FnCW*eWK3\rջXnA|{Smw | Sǡm(um2A;<1Z%z]j=7AEE=jv;x[D 7MWk!6o5d5o=eU0]d_Yv^`t. t"&T ;ha9$q}f]Ʊλw ,:(͸j:޳} $;%~F%61 lCl K%oxl1c jγ%28JSѶ1e| '1ՊSK&lA}p.vqEEYc~2b@ kߧv3}ߣ1bQv*vIV 9X* F#XCVEtY jy2[R>V.=ycBޗzPɣ"CK/_q{ ! ӕGc3,;]#_s>[9Asd-ycA=^JخDw5`@1+ad/ڗ*Ige&\; Ɓ~\g^Q޸#(W0K[mɩ(yt8 Qchձ(0d]QQI{4jc ;lJIJ!Cw m:@2wN)FZ1fN`Ŋ{x!QoxTO2BrFAYGKTJp}JYA>4Gr@6"^ɼW_b^|[8;~PdR#yA: %^ԙ@g%]p/fvpV -*g䙸ҵ #+ q~;>Ůbp 6f;n^W_0=QkWi2ڈ?[)m~R(y~qc&E Buɘ&uX ڥW,-;\rk*p["r71v~}N>Xua_830xWZZ ywcY1Y) O~^@ָH??>}?蟗ɐD|Yѽ5MKڑO6`ts6Bٴt^u쌜U6ƍ:7&6jQW??'3I2!N^g/_6+zvQ=2`V ow3a7gV0^hWTKyPA> x N`i%|=T}ga$ BlW/UCъ*b' X|.kw,/"-vLcqgAփ|"-<_GEW?W>*"+~|YA.7n>DE?oAy6 B o z÷һW>Q^ssaaItYkcUh⍧kY"|au7R}heRSmaGž {z}0er{z3 WOQcAѷJ!XR=|CXո`ʷ{ym(yuXPլسžbyOX^3}ouJayڃU9q?ʬ !ݷb_3_=`͠<#`ON2v^5{vp25~f#x;{͵~۳#T?# K+,{:gK{҅8Ђ5sWŷQTV ch@v׏Ͼ?.u~}IN=>=Oe-|>.HPTl"n LVi;CjJ`t)%X:gmar_E6~aT+ͺ?g=Y{ml1oZ)v&·}HM@BVGMjC,!wH5IH:Ԉ5]`l0sSz\2R): ԡd+B.AL 1] uF\0yZk4x0,%6pŠlOW0*=]O 2.ڛֳ䌔t<ʍWާ]m8+F9{S0@ QTg3ߏݶl%f IK7Ŝ).Be*NR|]M-bq]GVfs ƩEݺD_͟ێH8Ù\]_=.k183F/G6ñɱrƥ3=3̡hs6V_s"d%"[j,sem= ^cCĂcNylcdnkWOČ\.ire/kʙ06z6Qsi.i<]q/#:PwB]KnT4L(4tF(+4.)S s؞ݸ3}^Bw8_-r| Q+f箿Nw8秓/g5ɦ_f?}sKAC3eޙ)Fwt(}\KSwSm34ʩ۝)5da(NXiwLq~k_-5AAP5X-L'|sZ"#IѵB. 
|y]Z9ZW3H)2e_: ٠a:@릛+L0`p܋~w_yn7 r8y_|+NxlC$pg0Ʌ RWsp#ZZ!ou`!2-;Deup <)œhXI>N!"hl6^"<|Üwm d}oȦ4b*"ƨᢖc{gݷcpEz "j}eK}7]D5"3^B-Vw/ؐv2T@ Cbک]yWw:hHt:גzީ=ٛ ,Đ>鏩e[S|;=sؗp; HU; 4_qӅ4Rh4KH!m',WN;:AYi]xwnOwj):ҩ]|횎(Q!1 pD?7Obq]3(]nDHlf#/ B" ht^̀*0cá&"e탢#fN\|Hp:#CH$n PbCD 2HP Lۃ9H8Rn1 1LSٔZ$Vb~mZJVl\P,z:>'?o>N;p)G/>o`ɢ~˭Z)jk?SmI'E)ڰp)(x:mb-v'cgz=p_n OI!DuIJ3_+'?m2``w܄dbIBA()ĞH)GBDZ1$%DDƊh$ |U!HP%jRHi1!9J %X T8Ҹ Lґa|;&Z|Hf '3OO҇V`vϧ_?<6yL1F|ǟ_b *EoW;c?t ""o4WlZ[f>>ŗwZۓYFN) j2߷ O'H'" K~ om =Ƙ@zڞjgnj}sv9L=)Rg'ilŖYb}k3"M~>ªeVѓ3L~%ouIџ"a}ųLC~sɭ>p7טiLYM *9nb0?s*Jp9us5ffN4bDXOv(%i|cefsĖ>9ByrB{Z[ *0*#}R(DA"Y6!S4*VNel%TlJ!i,3){nք x{ZLwx[~GJ}9T*X=w]`&uBfu϶$8άD1Zacƈ i8vr );0gBV ".9h3bnyR)5Qp 5?m\^|eE 7Ȋ?+r8}s2 T(۸{ Ɲ @]R?;%Q7$ARsFsv!U`/z$?u<Řb/d.ד<5⩉4TRdHK(5((fIN2XSe2=MѓLh$(-R$<"NL1:N$,éLiYL۠0 >g;QdW?}҅;v5AMN;> V;: ܞ(0 xw$0U5j{ _,C2ǣww!ȌqˁˍdCW3]hE _6pv8Xyg 9Ok'Y>57^Ү{s{BǧbIDn0u(UE~'6HSP 7Dl9qYܟg؂-gxKquSM@m3T~Ѷuo_![LR_%OZꪟntRqԐ_J~E#2HfRF0wg.@}|E.qhRT!& s.Ǔ>4(Fav0pbgc:NJCh=VB0&S%މi!q-ךA`?o0.noc(9zx~F#ܞY0 rCheB~LBCD\aқ9e 9'Ya6@\95Ck5<R߰&:VPn e^jхͬ!ҟryw>ёۡ ( h'5Z%hU?NcR92$jI?;oO,N!tB˨e;sy3}437G<;ņ}/sypŸ"#0k0dWd6 ic,EX]ѵ\9`nḨ(ݮhparbBEx `,ﴞ0Vao\9 S"ڲ@BJd6灟Q#('k]1~IKZqP-&`A+Gb=Ks}Zߦ 8bΛv x?{ƄЮY}[|;׳aGfUA"Df+lEom gG=(xS]q{W"qB"8IRId(:Ga/}zT8 V?'@/8DtOM=.'c|ON|foob+'d~VX7kٟw6P RJ>R?pU!lrn O\!i##'F]|iɎgP-CQ=.KՕIYm4 ]ymq^RX+p< X^b40@( LC! 
.S4`{`@=901%T0`)" s9SSƈHD!XevZ tb%,&SBtʟ i-c UL0g@cJ2-$˱LjDyc H 3BVjvu%)r)&1ʿg /=}NJ0&l(bǵڜj2U؞3qES`Dkr$ i*Z'X)!kΜF$hW  D]&eV_MzL+:u,TE6f<B :ـã]zdut9d-Q:[]gZ 1j)\.(#RqTjQƪ4|=.2Ç95xD cБ;[E"x7v6ϗo+knIiRu>[1kG{z =OG@$@HaJdb Jju`zO`Wk# յExV j§Ȍ-o͊}HgǕ.;knkJs)dٔU,41"HL'!yʪ9'XMapq3^&pFc61$ Bq &X|#c`t}J؍Th>=%!$r#8`o+ ՚""JC" 64T(o UrT6-tjc'fރgTr,nV@}]]?8V0CmE7#*J`g l$nz6zE+Ta˟O}|=ɫG$wnR8FZ#}=kOG|BH,[A~Xm\Ώxu1kgvp':^xm_1&E#ʶvϼUh,Ot*>B8^z,w?ܶq5c*{]OR7;,&!WknqCcXyk iϔ㖗P?fV0m>CH!粽>&EZ{E \esy+He*otqU+ͣ5~|ޏu"Hqݡ׀c]MsU\E3DŽf_ƃg]:k{ pBk{]*+@S6۝?L[;w[ShV\YΪ3Z`DKOj:&tiM Z ޼{I]mԯH0FpG:b*NMԢb?g8y޹KHR_MԬ%'ujۆĻhpEx.*(5n.3cDYӆM%5<'v[!u=XQQ_Rϊ>re{a5G\)%mJ/!uCS0w۹e Yl<mg'9]\"ԡ ci5Yn'ОnƜh;'\Fh1**d;h;8G=6]z~K%6<\oͣIaex::dg:0K wfǺf1a;eħPE=$Xi_LP3HGG<ĵPփfVʏg(TuڤO &CXlN;'ޣc՞K}KL'|$7k . ٪XTa qL?K[sCł?me wMY,,GXL?0mIiY9CV.%!y[VtJ$TӇO RR3-) oDڷ1NN6&F["~7hB:h^;+yJQ$S>9ȾH8[پ&ٶ?t[h3p=<4%mHXgd[DhGlB9bKcm?3c@C`5uy%o'P0I9n >Ѫܣq1 &N/p4J HX;+ vׂ}7(Tf6̄#f1F( HbM:SuSmx )O.BZ@tG |DBvǢd؉P7->z =\=a-(nٺND$ƿv޴Py`Uۏu/>O irt<e (x3ɕjy]\hG#xS"IrJ#"RYyQӆɹdlUﳖ8X>z3]'&ycT1HBIGX`!js|9.X!UD%6Qa2ԉ8`0"bd$h\R8}}*&\Qw@%(;uơĠX7!Ek_=c9c&A}fӔ]~ ^eKD R>~}2Rj&Ͽ%߭?$XM&_& < .A£'?pLeN·U3,ٜxӃ7"H11b}^X،q4D)#~bL(؟1Ɯrmb,7-'=|妇DBgu[ёdp/2[cL1#,`MEwGw'72Q9$B#F"9 0#e„QHbYM ë1^iRVt_3I"Ͳ@eyjHC޸zXIjY7x/.:֭CN4Q_AlۺA8Z&4䍫NQ{sϺIZ]8uEu|QǺs!κg8Z&4䍫:>`s]ʴϐf1k 3efak0wlCh*4ykU67\1v;;]\xC-G|kq4*[b ]\՝X.!FQiQLgogerYD7ӸS^ٚ(p:wZD҉0@ȹH[%r~_[:KDŽk˴gªػ la5KaYoeKj t;n$Edx NkdJ(n<_A i!uI):g !2>":%\NN&v8/~Q\2Fi_@8TQf+ )x4 @68>-Bp򀣥z\Auo󄰕5%l/FX5LJxyv"/,̕@ou)ߪ=6$FJe@u/bj1}v-f}-L+^,R F%qC>)ِ+pEW+~E]J] ġiJA̾@2X*Nf$"F8fw;7:gAY$>u'ye0bbIqlR&J8TP ሚj *AH;4}"!k g")'PSpH#D Œ6HGr:1WE`jw} H!IYD <9Wl7ykPH5L@:0셢@ 뮊VZV͊㊪څQ4v-]9,]}*gT#X)HΩ-5R0ŝQ-gROf_҅`9fb_> (*P҅kH-'<$|ǃ.֞9:]_lAvm-t'FQ- @!q;[20mwB]-mZ]=.pE3}feOϐҥGSIc0Q1oJ2ٍhJC[ Dfå0nߦlkyfĆC)I3Z ?ٛrEqG= ʷ÷c1BL,ICUz_FO`{;ȕ(/&-*4$![͉E8A$)Vuѩ)%D$LDlh$f>SA~R`K}+"^ 6؎LSV?#:Յk$)ftUM5">NB5rܒ L[,aRP%mVAO!D{)*/]tqGejSݕ:]HeWwb(㊨?2RݱJZwmm䐚EU~HI̩]}IJ@ om[g E7J]E===݈pš( T|/R08/VsZZ,jiRsy0,Y_~]ٹ_X|]SȻPMQ{5&BglqN]=jفĥ~ ˵X@lR>sըHY 9\y`u"9[ 
e{2$䝋LI[Nn&ý͇y`qB͇4$uCl !\D]d6jqn&#DqBڭA/S+ngm[ E4#SRQD%J!nD V$=ze_r#FJuo d 8,Ռ*tRzR Bj )=F0K5ŰK l)ENJ1wVH1Y!OmtRzRjח}Ĉ~N;t]7 զL'-I)唋(O}- e|DG9 7yP-bUJ!eĵM@yH00PRarŭ?iXU8C5ϞMߌ77飅$XT3gX (GҪSWdٳbr+;R Su6A S5.(eAAmgc؟BLso}`6\[?j*^/XU!yHPo?s2,Gi5IuM͊>[X_ca~pX!ZJ#hiҤK]) J.1ΥԂLUtD~D R9C*@g; e`() WdI}Y$t]fo~[-3{XΘ=9D;2[':-hE0R8BMXuLgO#ŤwQ`NW_'aZ毕^6擽j;.OLP.AR\&jiC{+ڗtMj^C .R:#=hiz)IE}?אTzӐBհIT7Mv`JrOBV!A@y3mZI߸XCr 6fC5&o6- (Opp ө3D;LFIp{NqT~ћ&vYOiRbB/Άʛ2ᗥ*eLѡŅN8yY83]8豿,r"AV |R<6ȤR{XA'8q!&Ki^! z/KU>Д7>`*SPTCgDPMw*TqSSKSR^3yzK0U (U]/+TR>4u/AҴ}{zMxhfGx@i>.'M#qݯy3L#P6ևf6k>K c 0jGHI$Ը@\T?:9@Щ3gCy5A-*#RF),FpvIUG":Zҝr:T%ZݦoiaLO`qñX{:?YEFB&C% n+:J+_s!T끎I*Q]df%BʼnpYhKVhc$J bUm:MoihMk7mg7 ?_9"ZFF@j-fIJU7jy[7CWɱϛ䂔 _RFOBB*d٪s <*L?LҤVaa dH>1C\ u'R,ҵ?u=cfӟ0sqsu/\'v31}8 ~n'B$r?v⯾&dmJ+rz<֚#X.{? Q\rӦY)GA0X"Wن뿬S,3H({fayr=T5ܯ14=7[с8,Մ/v.NJ!0mYH^yj3O-)3D#LwwHM bʕs@\R hPJ>F>m%90@@dc'!^.k<^RDj2_=s,x9Vx-yU28 ~ H(7K<ҖI)*SB¦[Rl1H7[,gSg@ +;{x/!V٭ 0P*Is" i'2K.))CXe(& x<Ͷ4Ǩf6LG/26NfoW_S 5Kq⇫R|Bcx?AFH_O)o\,vDm2MhPف|"7$,gN$bzŁ V;AWc8X+ &MFJMzQ==|q +mY2bԟ\_;[D?dQu+GCb8D9He^^[w".UrF`e,ohl4wM7JR +HƼ g6\]AivЃq4M>?0%OQ7^+S$UUf9\յO1D_R*aa޿:eQP H@&YB$M6`\ͦ=_ά~@tFڂonL'JO3Q3R@w `!tXi~b,-T-b< g Sqwϧ.xӁ(wwO^NRi;9ilMcԾ|[u0K5: ~0%<η|Ķ9٥.AR-tl8ڨ[`ڮؙ69෫fz/۩c$ ݘ3?&r՚Ay!4~UVy'"-sv%71 6PvwQRP&h%I;Q'"+kưlvs偍踎QD5g}L9M'LZj2$䝋LYjJiAFt\(ݚ" Ikng\,[E4#SHu]$d2#NcY{[:\3{GSۇ'oq=כMH*0;3ylLJIX "a,T{| 2W5&RbY.@:)h) qMT &, IK)"f#:#Hqf[/[JNg̊8i"f'1Մ eK)vRv!RzajhW;H;)%] hi'1 RIKы TqY>L'-RJ!襓R"]Zdܵ[Ɂ,^vrh&aKfCYuk3aYX\7r*+;َvws:"41Q [Җgj79K$Tt9*M'8Dh:vnpDڡǣ2s:&47WEƧ_WIXnvg|`U>8a <g|S6Ѝ.ax Riցy+Oΐ'wCֈ4|wHkC69' v\Ը⺧Xiuj}+lF5ja'JD:Ʋx'T+``x`"֏|vqSG3+`RVTg&#eYʫZ#;. 
9a؅뇓ӇX)<^ue Y6+ 4egZQ`V[U),.P&@"uBWI;9"9rFt0EμHcԛNh يn.YP0T3Ļ^J)Cxpwl9={Ķvh##q|")9뗆AxȨR*H>U"р~(b*M2UT&1<a80 ϵ pˆ=Vfa܅LX&|IK^NrجsO@ 0P _x6D.G.T՗\_~W[DGjxa=zh twwoD@%޵n#bmū8`gw,v(IcOw; ߧnjuFKRRX,~,Wl4QAq58Igl)jϟ^~`'4Q1dFpu(B޽7P *^0[dZ9*5R~s;lS}g>0%.!,DqsR[)8ԷUlUbb'cX*.VzV J/=.mUjAJ/VzV *JA͘T=B K0N:AZ Iꤎ v Wemh 2*teq/&LK)PfSY"ĉi&6fҵ!PH|_=R(TPEO[ivZoF@npMƦ(cU(u\wCCj֝#g qk Ljf7` 1jAQ2)?o@1?M RšybџF\ttPGw&UM %{kۊ;ػOJԻtLnЋ?؋$LIO[O[];#K\2:KWœiSe)lc\nY0/ǒu'M͎-?)^l ydv][U9Nkzvҭwu{3pğW}9JiE ΰ(-7"p]q݌"TǷT-~vybe굺m9toU*)5 ֭ji'F/r`myd+gch}@sE!dQT&'ۻ'Ǚw#}zZ6iJrJ0Ƌ3f:.Vr%[*G1_~3_QǷjO_zj9MŇ'f ]5x+6Өkfa54 31qe*B|tЅߗU3B8coqP)wC4?Ny=lIql &}0[PEyy-lԥV %XjXʤ SmSkeiSds",\$/*Qhy3YT>g[ In&>p zƗ>Gih*9YݻCYío֡Ղs7Tv XΉsZ傦6*f $47(*p f c <9~ml)wjK)?cr} 0Lx!{Jd?Ns!3Ep(]Ɣ"m$\TIKTS.]2$:hbhf\F˭A9|.]b7诛)/=)aJ(I3aL E$1ę& ?IBʥL WIdLkQOP 4Da:vGP:31JEgTk. U4v ;eRJڹ$UbA5؂˽z?>8NPfnbzhzXRvw5|rjCuwjm'o'n&/ y OpNRW5eVF_߻dfa<2Y٧Ŝ7hfʟ~r+ޫc>M>Ā,\+3-& ZՇV>F5jTt!uS>mWs3Wfa)^zJ6|3=y o@λl6]z_sd9E`\iÁ"*׷m?UXTfWdTϸjD׵>_,_f%5 >aZ-Voy aŊJPhGqz*@ R|G <|2yglx\Aɳ2a;q|N:?Gv߮x94M ܷG;:Spi pMz!j촃yWbCTK`O:99yVA߻A`;V\{^-1B$ox`g00C/:Յ$ðsBl}rv{lPhh'#ZͥVi4ͥ[Hր,:FyC+9k.5HۻAOy2¦4_<~lu觴xR`a?o --lٚW2?Dcw+y}=ymT"o7٣[h?:-(5vx- I3>`vV%UwZwnQ6Y x@A>w.:՛w nmX;7(*5n1xX BL'uې-ͻ nmX;7m4̦(1 |nK_w̕ޜ[qt_|3 '~+f!SpuK?כG_|BP&z!#LvS+??~+e>j㘰Ua(1!AUפ^O^w"WD$ڐ!0J%1*т CU|4o[LR4)=ig $f=ᔖNwqx>i>Uq,)D+_E^a0j8ک2ibU6)pt<W1'Q:0#T9KO xƗ.E< E,>gIV㡜{^ uN6hƜ:H?e{W1Eg),=i-EDŽyN'"=0;t^@yٌ(Ne JSDh3.T2G:L Tı\W lT9 }ha;9aLR!dΤ(.ɔiJ]VseH-,>Pv` dHK1)М:LXI 2':Mw1i+X&I1DKfJ`Z !~@]%ZUt1*Xl~nz A`Rq%^#icsax`0aƀ4A X/zn>ƍWuŅ!w`{ z|eDox$ÒsKz|xqiY15.!ZθwNTpzjCtqoxuQ?zN;o}uZ/FKx54j8.G޿HY -L_ BiXNNZ'5 N`>>FEFE=46*ΰP/dZzj-hGH3eM֦t{1s0힃 ._R\_AC %8#_﷩1_ozinO#$ Wemh Ʒ)U,+-&LKBpl_݌[76|gxojޑxƱUዿϸ'igǹ&jE ;5!0d Lok4ɓPcD'n_;DN#aric8F>#kdd Y152%Fii:eS/J64E3SQu }w99[HlȭT9 0\;>.~ ?8uHsUxKڬst4S۟k0];6G> 3BG7`?〈.toX_W.!6@$"?1o^Pz 1g(=l0\Z'>/82H'q^ r1-B HiS0I")34Rn,䆧:K yat7a8%˿ߤ xƗ:MmǰBt}`~s#U.39W>!%xS~ 3"Mv.oZ+?dxykGIK3gs\)fOF%J_Kv|- Nu|O7!W]]OMH NeɪT3-Fktn}Q=7P! 
oe܁uu#sZܤ,VU䄲9F"y&>$ B:%)n4b`:ϘK*ħhXR5mRZƳ&ڒbK +?9 VލVfl+ XCM*u!ý]T th>H-9eMJNMRKU0!mVFlTkn8rntZ9Xi96ӄ Ӓ+ Zrq8x PQtA Tr K /ԧ)!Вu?? hä98ܖI~ gƻ.>$_?=2Ũu]QD͘oI0;-䪧秪@.^"'d,,ژ!ДHUf/9 JxS!! o١V-fuNzxơ*%; TrucPK(L觎^SAkf)'5\V9*{-9_?xxsu}{9?esRW%pC Y|s/^%6 R+jC{ imp/A\V`pS]AEySeҎ3$&wRx[m."1) 1mzշ9 &OvzeM-Xב eSF?'pmR*AOIp\#0="-lQ&c2eq \(Buld`I\AYҾ bq>Ij8RS_OBa_;.}]%t^{: řp, X }P˹u|{^(=o5$z. F!Tͦk2oAZeB`yjA! 6=E-  1"9bL<~;Kȵa'T17t.ӛ`hSp2ZJy/jTbqSq4R RY s4w9ir5D9/mU SwfPҰ1, 21Q tz"1aqu& 'D Ԏ&IkѕXdgm ݀V5Kes|ڕu3+duߗ8t.u+T~jVL:D o }_ &0rAF|B2>J,%bJ pRe ),[i,ZZZbbH451HO<`.9LWs+ϐKB7 s.HCy@- HߵÈHdgRjuH%:);&=iۦbBB^Ԅȡe^p8 , n!sҼsN>}{9s5jOiK}28dƷGW`24*8sg gޓ=FBr)`N닕s=<{]ZSQUjԓ<>Lp*:("BL{3+Ѝr"3(/WPMtv2.Rnr $(g`1&y4GLJN£~TZ]m)Ƙ'p8'0)@D!ՒQ&:|؈F:D*8mh6V x%SZc=%{y29Pq\:7tU> X7Dj)*'C`!Hb0 ң(0G,!;|Y`/5asO_Gγ4ذSH~ړIu+-l-[uzTǝ;Xz)GJIś%:I*НK̓J"tĦ,EQWT)d=r*z$Y_|aUȌixlhj3WuB91y\\_(&\a}r9Գ$ ,F )t1Y #<+Cˈњ-Z.qP*/K"qMx]/_?lDO힯O\`ߓ=[K6CMFO 1wEHBԈGJ3UE\a[6_@5OQ͹'I"=Dԁ` r$olCbY40yMqHUs5Q4@J#k"5ܱ-T?m j1Tv8*;73eupy^HLSU~xbj.](_농SɱimIz,EKF :RLXoWi"pLewۗns2U&Nˋխ:j*\ɧ0fy̪߆/iMbQɏ~7Ow0]/&~`5*?O;0 _jn_%=뷏abAM$u_ݙYn?ݭ7p볯ϤK{wviVR(hާ2,Z"}4 {߽FD5x{_G ?E륨W{@.0,Do=1 mo iZj}mpʰhcQP&'ULezrOf*aMAyU mUdg9= )T ,lD0}P@/S{K-B6Mɥ {hBGlީAc`LSKIAeɟk漜8Y%-{<ƞSj灤IDC@#/ &z!׭ hfQyñ7چNu+I]q_~KW׷7gWG[=͓Μ)]d A1Ea=j]Ke';%ovi{n&ẗ:GL}Lwnғ@>C0aDOq8Ip~HKpL9!]ZdaOzxvҊP?=Ũ.h*%1Rbk wI.b.ÁTW3'N &岆qUxTO1V'NzD9 L瘍`(?L;E㟨 #ڎlDkQ.jy+E4vv*0W8=0]J?:KLnjG!0;\qzi;:G.҄iTv%L8 0*;:" !'KQ6tϊ] ߢSa-١2:jv\^ؽNrƈa9oÂ0w+kbjբg^(- f%JTz0sk%&5gPԊvþ՜KXۘ  J.$Jo'z{å=Tk˔7pԀ92]ը3.!?yFlY81woP'@sY\˹Hz/jV є{RF9e,#t$W"!Bu.Y s}Z$l 湰A9~b H_^:`ͪ35L'O5UONz*JcS-aeP%4# Lg]YINm8З\ /9.wFCX$[Z$n~CI=!fd^?QykҸXF5 #5%˪OJuWhTk&{NW@a6 ~RׇI]>4յ| 5cT=|(#h_T!X x&uzn9|@yNm  rp}:MLC«/_(e{R \N^%e0nJuӗ%!|S^ysܣX\TaZUa ]$Ie/89h3,mWxsiWyj[<|o4\s&JG6~RjAKY=G$JZ=G(M|^ Ӱ6|~J]izN'y3c%Ѭ;R!rlG޾'Ԓw0F~|غ h!HNt- rȦbMtPE sUk ?l,c,"rZvVJ\:RO۟ Y5ϧ] ;XƄ Ը_9oQN-W<-"R) B꠬Ӗ@`,g3xaX`&Qʥr-ڨEͱ%# f~Y'y/5’+15*÷LeҤu?rv/(^tՈ# ̈́xAxnޭ%Ȝ4(ޭ[VL0ጬmjJKKxs' *Ir.>46`D-*4DӁ$ac(F".ojK5vzޭ9?뽻K )1%:jim]/IaFS#Sk!Bm!c! 
du]wZ|L~=o(J rhv8PoNZYt~7>6M<XSG'KX~l݆,uxX-๧lhQ'9d-gI1.sIc4/8%FVhB0h^:E<Q0Ta"JHEzd*[6bv*~<+|ZڒӰҗF~sԒ4IV~ȲRABӰҗF~sR/}V2I.cq|L+}i7Geg+}V3RNƵ1gƥy JߞSznuRvoRwr$qn&~%(~z_fnC;ݲnqDP>Ը?e .n|lqjv>'ƫ =Q q qdl2`w0^˹ ϕ^VRT1me%Qi"ŭ޾߇\8$ ;k{W@bVɎoaֱrJԂ:٠YܓD,נ.Ш/W;@ qva:DvTv>{@4d絢nH ~5(G7<ύ2-xNh˲eК+u1 g~VʹT\u60$4GV M" j$19ZmDd{჉R@7d;Hq,pYؽ WoEr&1ܢ:da"QBpýGchL]ᚱhGH`!Pq+d-SDbV(:_n*CwS_Τ~˧FrS>oC4wSK]0y09 |WmP1|)P'FBj&DQ8J*54T-mLŘ+  Hfb(\kA1qDyb,bfW{aU0HZ8B$L) kq)VZrS{6~uKR dwO&nRcE5Gq$`nHPޗxV֑NBZ(g1% ť(ߟA,kzЃL ݍ@4:Z^_:QpI(ȧ ?[TY}K+_m-̪p&d^<=wBVM0XXSo<770d cʙڰ%r@ z XzG"E_;]!7 nPTkZaȕ_]cPU)@V{G-m+808SQb;rQV3v3Bǿ%ׁivXtr?Zȏ6C+be9?ލg2kݪ‡/:X~lܓ@HUAbݸ ϿV73~UzYG'UN~H.h9ga8MjE˴HL^ZHr&}۹431V2DTsi:K8_\/fժGև9LRW݊Ec%;'tIM=[I] (FW!b稀 gO   1=SJN#EiZn)\Le&=pj%RdTa.FQ ۟Jz&5q 7nؚe4H5vIt0ī9J¯pJ.*Q oEءٱUYf\푍w"cnv` r#y =N@>G/WX7WfFc$Pm$LQ"%9o-GREʤBF )^I$]27|ac;$꩔ToYZ4XoHSI 8AQc(*'a Yl8a` W& KbHZ 6Gy`s4F"} "FNqU,X=H NpK#jv0~73McÙH82elZ kwI5 ؃ 4NYb%:PnlŘnlL;{i7G0>wygdZ)3*p#a(nG-F`'6Y@%1 !ot 0Ƨ*G,q˿^兤Kr-'\+%)p,lӃp0:bilQ5B'6Ғ"A:I;),jGNtuL0CI18=-,t`"VHK^ZWM\<7=6[+Nk7*szn-iFz-;rJs4HH$J](+1n?E}f5?FrwRyc6,*W3s7*\?bK4(T"Nx>|DώCvuّ̎1;MLJG&QKr0-F";48oJ98PՒ:1cK`Q.:Hԉ.|:XfEˬs/MZTɏun?Ū"@5:쬖$J* +,|ns{yD3Fhr;d!XGbzYJz0~k rJ '*aZ⮀PU\Wy[(9 KD[r690a@V']0FᬨH8'`xCޫ\cH΂NE`_:P311`,YMa! J%gM W VEU! cʠEɑ42! D!Bνp [u,qXYS։Q}N/ 4Cd\MJ9` XބC<(Z)kPSn@i.tHFCZ zX[)gdQllP,֨LgEYhF_cãThX¾C (7a*H(x!J\$UQ*9R )#&12 pv'iĢy=;" ķ"):R }+r0/µjZE㼽wF` |q71߃{;LgK|7E;iPlۃws- XLg_/{f̣8„pz6/Ɗ 6pcM8xcD7>bK㥚1+ H4AD@H r\"Ur%wz(Lf eS-dF6G.)wus$9!nZ)SP۔FLa6MC^6n4-vQDW||"+xz?]Tk#m&xN<Ӎ[7l{d͌E1śL2M;vY(+̵-@hi]5j9*) 8Rλn#C6+dE-Yke/Η|op{?=A|)~LQEeyn̨}ibsfԨ`u~8 R RKnö*7p>\ӸK"CihI3B9S('՗LsB6e\7nS 截& ƂK܍e6f,ˇBhI"}cG!)Xtiޒ=ID1HG3F, 28#Lc¡lHvo%?L[fEh1۾ToSm|lTw;׮E/!Ȑؙd@>,~e1 PoS*ö V-]_''^ fN٣FfbNE2 ^FXK՗ڱ*Fըt*O ?JhgLw(AMDuShkFN"هc} >nG#ò>UK0Tїƭُ,;RJBk)?/Cv% W/r57݋ۻE /n)5+jП4+M‹hKmӐ>9WRΫq/ꊫ_Te,Flj0.m[{Xq~'oTuݖvZCYPKo#^BI/hmMva\;3?Ij6 ? ,[ΰhY(σ{:}u6r? 
hP8i|:1 Ď:1o̯o̗p)o7(!ة7/$(,+T#"46LЏq5 /џECUc*uǢ9ڰ6fY1zcO|M:*v}թ !j4HڪRaZ}2U,0UWT箨vSi.(sAZB &g %.KxFN Qj1k_2xD ;$5( V<gx{*V"1GȀk " %V\nA29&! ʥ%3d MUTpCl(.ñٰ`h6WTRP7~͆Y2=sSX]WE$q4B)$`Ppw/bx7" i*Pm3* }Aҫ@Ez }>eL9az?w5#cweYc(  ϜW'ѻ$!~_U9: z4lg({w=+P׻ [|t[a]BZzh/u>O? ?ݶг,~C_38pÄdodmжIk(?jĝ6z""@r9'T%'0x&sE2[drɖ1pn3:{L| ~J৳|<&?HcAy\e܏ôeYg<\"r%瑲keŃ}s-kvuM<"n2aLzzѧ0+rz$䍋hL[7R]nT9n /E~UѺ!!o\Dd·:MJ}wKĠƘ8AW绥.VqM)I )uKŠƘ8AH=떼ʾ[ EtM*g!ΕcluQT7W/.2> AJ'\@|*VNk ~85WP^vR> ;(o9?Yy@(᰺hc*9K \: ^߆@Ѵ~LOwnvlCP v/f4Vi(Yq 0> -kSCN?X&;d9ғq"v͟A2(g[Ćb1QZvb1<)'\F8C+%0=" "ܪ& P?ǰe G_E$8 {bb9,FIZ }4B_̷ "Ӱ_\aJgJK%Gnn+l~|6:33&.L8 B3Qzq("1Q0k~?OoGvbwg_b4˩/"s*_\,>~ kZlG{{4L1c3F .Z?{㶑0/]}C`Og1q0dF tI俟jRJ&7e&0bjVU]]]aAℑ ur.$"D!2]Q*3]2Naa\ >%=8J' \+>rfxc\.ل b' σ90Wx<0vsn'jDaQRG=X7t$\$&JEYL8. TtS!NRqjO4d,fķd/Y`(#@Y8#Q)[^ \ .Y x r0% OJ#>'&$,ݡ[6& 4+,Em#kmٚJRDm-,Í y&`i0*=I$lwUL+M]))۔N[fJ츘}5Ӓڝ &ܐ1NIc> ljSENNOF;8*|"N%:EV)Ym,)xhFrA(/W0ejJKD l)fǗ*tF*M SNQ@1Yy_]_"0Oq\gn9Aj@|"J(,GF`#\3mt[XrlrQ[gJ4N0cc4$+!-26`$GFKVs֜fUj_%/?҇M3TW4%b;·yo +.Zy&s5\\3X T]BSv~᧠5JS1}(<14-.ПZ̘ݞ (BFvr.wERpEE!rEn}rX=[uWĵQ-p8r5;kZ+ 2W)0nأM/2#+",g~GX¦:1>:ᣳ>:c~،lE :BHNWSC (ru,&WکAEb * :X4ϻ 菗ԦH_Ee%gU@"V3(r&66g|F7 (ags͡t1\TQ  70^MJ*f-)ϥn훂]]KAsv83Be4+L%sL"a}a&EZ*aX2`J0x}I1~zqT~qts}Sf%~:a5)q%L s]Kk+>]op!71è5xA}>pe@s[ΝO{}C\PҙlW %'BÜkA+ '?Zx39 Ii0&=xBruHzGqոtkɷZֻ!flXG*C~//s@ӛxMJaoVCNnטfKWFN.r< )F #b+DYKLdR#TD&uQ, B@.)MwF/˽_ή8d>ADe͔*DIŚeh 2uxP5{X@VH<_"CfFX!PٶĕJ?Z fO٦@&*Fӑ ovȷ !\lWϗq2]qLŁ!>_,ć=R^Lo6{![u2[C!Xl /ZE2K%aD\aVM[;B^ x,Juk,/X/yǐ!W\jҮ \xj>I-ϫ8?DޤBy۠9Q(؝Ok^7FrȘ*N~H4+aBTtiW 34䢽 й]4-F2Ҙf`WEcYeK;{JޯsGw|>9Eàjr,s䨞 r\;g ,*B!1tW?˾G;g` ]iQ(1܍k!33őຩŊUPPN]V|D w&ǣ3gdNbN8oM:)1lS8D^Ϭq֏so'@G.uWͳPůfb6^l^YzUpyw%UTQiz dQ$ R,"k*NņjfRI PˡB?\x YT ,$-đQE1,Q173L$Պhqc!f8W&&N !BJy[j5kTW/O5#r(oqh9_/bLT ?'Sj2?P(Yk2Z$5`J5Z(Sx-J`Oˆ0~FE9EC͛橹?7A.eXnef+3W1{kU WHNT$=(2l?~_~rW3~w~z1%PwFlbfƌB:s̨=S(C*t0_Mө $N r<0v;jL}csIBdJXR ڝMqDޖH0Da0ęA7VT ոImEC[5>^8_gϞly=K~TpG4S8P ,Y ECI#~`}E03Ru: aݝ.ᵏFׯx.[#3un!:6|ph_"N?U\V ] 8u5(݄I jΌmx}hbh_BH=å`cB;oJwEBQČ20#q:JS45bI*mQ".(QZ&t} `0 U7vp 
iB"i`+})N?4`1W]>&KJ-؎N0, IX lcnT'K+2 25j_(W B$wP.ef@PM];( Kj|yzf>_aPsҡ^9ZX>|~^ȟy1ٲ JD%S%g{< "^Vhp>!qb_g<֑bk9m5lJk#8aZpܭۗ hMQVHBtN)9g6̈́g/"+ؗN8]I>=CqP4i!v;>&n5˩ȣv^.+c~ߦEċx~<"Gdkk |=ª.aH &Z| JZiEZrxƙ" /@7ygW7>\+g?{WȍP/;X*G.nc{&.c$},LL%e/`0,_\jMX~R \ % S0m@$k/$btX`Q136ؐ`0;<WVF wx=jIl6F}Z,>UӐ¶} M _5)|y;$c^kx 넕+'qmEAsC27Z=lH|U}YA%9t$)|އH:Y϶g34|6z=Y3D#ٯyEN0E xn:U+2C$VRNۊg;;8[,I!OdN :;jKW\pƜ]Lc?p5 ,. Z vIh劁A޽_^N)\J^IQق+(mvhEFXݨcZ*[ T4Ð7e·PK:G糥01|BrϧQWƭsqj|!cMY ,P)GPKՁ $}=830ySX}0tYۑ/G_ǿbwhKe49Yֿ̦|g~X>?Enb~_䦜_dՂOƎi"9: FX4/c3ǣnt3yYcTu;z7Z8fe\ q[vG ,E5ܾ/w+Xͽ]1SW}Sr\|3:/cwuQy$()Wjq=lwgѳ&뛨oJW( 5*ȑL0Fi)t˘SEE"ODOe\?Ry"^x"}F1#L*!N`Y')d"rAK']k߭Ѫ@FkZeh*h-%^*3!e;0xN@34TX ݏr`KYE`\/lQ\VHH}L]!vd#SIFޡjB2R#x[ 6"H% ĩK"ҧhP>Qpnm)z$>OJtV3^PHC㪭萣hr6A\H `ȝqڹeﯿn_;\1AT:_y" E_Os>zGUy]OrykuǎO./k8ȇyE]HKg S[09+ D!ԁ 3p:X95OtCgrT ~ ~QUC~^㣷c^YB6q*pM{o}~=}ףfOP%_+yw:?܏\~E 뀀%m_dXv;dTkF4Z[wrιN@jIc%Njauap:+r8VdN'AD~JF!l"ЕsjsQפ#B8ǂ5fJT+$}sc''[GD0%i:՜Ҁ \|hEvsYܭ揶F6>ۇjժXO~Q8wqW` -Y"On\RBx? ZcyϏ/`.rԶcE?eYChb~n

שH2[J)ORʣ/|RJQ}N@K-,qghH] Z gX2Y|aUNx Cx993t 4 (C lB)!J s&`с+eՔc7@H( }R]fv]Ȭ*4'!02nM< y A>N)'-UԞIJ&XV UaɃCajHu4Pڗ4SMg;`c)jKa@JS}N5|ҁKNR=n;'1ѣ/"(o /KJ-n~cV/呯EX*%BSx@M8`<6zx0vr% k$U͟+L~L TS FU* ]Jdkl*|Dzk;n֮f-Xu.'[s/[sZPM-1*ثfg4^xi"ڱ6B,zun-AP_@1@W AH6X˯2r EB~#H FoVDN`eצBA4i㍚G11ro0-lr%AxA5)]T*(IjJRZg\k-r/ySٸsK6a2 d@̟f>Ec|bޙ43K8b'e 9KHY"K^VK!2 r}FVP^T!%A T)B l7f0MJN:TߩRxJJv| qԦM2 >sBctex0{f6.Z}X8OjkT`9mg AbYA%5HRzCyi8gO ovg1'5 {sPܲ٤BA/I y> H1Ѵ팇muvAL{{1k1]ڗ39+Kd@jIHb:ՒO @mK@#,FlM>o5u{{ ӝt;MdR\k?8zG:]0L"YTkMGNmsKp\bFU$owBsrzs(eݳhUv\>MÇpïv뉪1ժ+r{⊜nOVt;$\Lp sȉ&TcRUJGq T0:UJdh(֯UՑLQy"$ϲ>ϵLJ0Sby|Ak7~Z_bZ6iXh݉p865d$؉nI 5v=%&)4]ZTtꘔh͖ʡWui*X5ڻhڼ4?b=0l8l6A )AMe,('uHL8DPPRb,Fz|Dqrtq'߫Џ)EGfG8hd8k'8jaI WJ!4f mJ>Rj'9jB>6AYaA(9Lȼe9e+U-Ғfԓ@שT Vh8G鸯8GJ [sTgZ۸_a˭ݽPnUvRqqKT FbLQ|8Ɛ/3iӕ#j8hh4ЍY߼~Xaa )&/@Y,"u܌_sp@wUoN٨_~O8@ƉW̚vrEAKɗlbQ|Gw t_z_D&pc%)69ov40]ҿ(>}=AHziv##=C}\cN%W.%0TfrRNK*;K5^+sgBYO-7/;߶$9cV媼*7nURbV-; Qb?ɻ"4!)i/./>>>ڋz?ʥ^:r*Z<\D!J`p@"?j-#D Qqb $e'qT1{wF28P&;5vQMQ̗rƔ tdthecQLwGD^*oQ' &;BMbJmeSL=ЊM8i c ։Hv֨?UʹmAD*~lpBcc3O~͞6N&H_ ,JGWМ(!2yRYeTyWoޛ>{;83Adr\ aFNF,qшz6$#F#V;i8.f9O$Cw&<"<ř@Gγ\-d[cї4u?@΍9Y!y9Y3[_d㘻Wmye]N4!zc6kv+&9=ɜ q(ÔHM޹.sCQg̜]8ZZ+It\w@sE0mA}Q?x6L欕q^nB +B}:9CMk9E`M0kԚ9oJ.E]isZ2N_PsܦuZ%gYrPER'[Bdk ] |lNR@k^9zK6 FYt݆vHJÝRUo^+^who^@Hx$!\z"u#N ,F FsLMbrH.8Q92e@(ֳ\yG0Wg^6-5X}4ke4/Ui\yo4 ƞ:grdL)Cq98ΙTK24[LsOi 'Q+N2)ҌR prYjmRSÌ)2ӀEվi4J+(8qHDL^@玦$78bLXx _4)<㍯6n|+/y/(d [,lg 4㇧\ZH%EoțO\/y5u~;hb ??) o18[<, tUW߮5Q>ޛ7{v9uٝ{\p.?ޭ/+J 6lB{PUyh{'t%>~ ̴2k#CfjeCʹZV[<.n K!ln`UHXuX4 ӭ2Ama]P}aZJ=FBX !,A{ ۆUbМ V%&(n:DZ_BN8!9ӜqRX,$AJiu4 O$xo<72 K3Ce Tis U)Õf 11XY PTDl[Hu&\X%a:ҍ}B]XO,bEDX#8)H!"IMRj\F:MSl"!& FFh(#tH#"_!GSk˧,[=u5?5sY篋{w^]nCL(H4R/\p qX`^bENr3*2<>C.ʚenxDd0y :kZo 8\ i54Bޘ:toH+qKEkJ ;/:ŌAPy1$tk:ϣ"XgLb?YꦇPTNMx7tсR )u(Oص?R]%}-5MțM/<|~| =Etz}1}N鹿t=߇Ѐ7}X3pV//rq]!6]gnV|>l~4ӂ純%_. 
E17rq Y]~R{^”n^r)kHCr)Qr5u޲n`y -Չmu۶n񙀁֭ U4F(QGqn[7U*٣u Dubۨb\Eeg-XDukBCr)S:!r/}V9=czpC^GKj{40DQ!EE|b1?v>~n||N72DՋm7 5w0.'kiojw?!/ݿ.MX6x;f6NZO]ئd30R)߅%/0Qvlz'tB?L?`慘3Nۯd>p:PK6XQ-˨VCb2R <Dž3'm+t/\KN}G jss~T&=m|ti|1Tu "joQxJrZ%s<ҺH:~`Kq/1U\[.4x'*q%b"=ViEz`@Kj*-e#J i0HdHc6dY1r !tm\ݡh4Pbrn_OEN`;f4i3,:GBKqbJI1am/2aVk LnX.%8<6[Lj.nLS_Ac#8ͷ`T0 ©aԢ;ߨU+(0Nu8# ݮheV)ackRThkTaKc*c"X`$U(FEF_>%&RH(`+`=Zp7ʙQ 0à n`40 n#嚅WpDJPȎi9VRw  YzXQz@ ]JT=.픟kpegZEʃ!(-U ^3Ec XNX:a,pdaATO='sW۴$l=%ZSE#VB4Հ /6S=Y*l =RaIpӝ[JU`ָ+/Tw,T \ /_(TEp[UހbwkpR}E)ם /aY0KY5AaqזZ &U͚aRc^" XS6y)%f \Rk"7RӲ.Y}"mT>џUNDZ} U4FFG?SoY7QT Dubۨbz4Âvf[hN|uhb1Q6X>W\۶nG[N&2 EÕ¶?W9, ;+s ZI '<%ӂә'~ 3vr?.-6@~x긙CwOz!,. WUs[+75ATR;̅ȇ|*.Rۗ(6)x3U!흊i+ aG$vsFh:4Yn#J+^R;OT&A Z[j";IoYIZ( P)N127ٛiԚi* UXj@ZL@& yjz*fg؀ 1˅ECXQp I3Sg3)[M (UV)3SexrXdISD iÕY4&RJmĔ+\c!P1A#HUU/&6"A$*Vc)Oh0JUSY7*B )I4\z"8HK2~E g<͜8+CUʞ{|#{*<ܧКa 2vM»Gtijolr7Axi!J F@H#;gCZw3ʒM$3)5z3vtxۏiLop&-^i sWGZ=IU4%SHZvQ`AKuܫ"{i:u!1 Ml(z%Aq-1oǣC)Taz8_!z 88X'y: 6#RI }IJRLύ %r@bL})!CjB4ip:[V&"{)2?y}dwpJΣ0}ϱ6oX9R-2Ek$Zr}Djl uڗgi_-/?@@< j"""8騣Z+Fi7Abɹ QR-Ǚ i0 y) AnuQ1BX48Ԅ=D[[G *tu-4!I+-(023uCiݖ Jlۨ꡾5%̖=a%pV qK82w;)ijq\~9|/)ln]4\MR頦&~%eg`;(9'Ro: `ޠt$w2҂7 ×2U4[8aAظX?L[c62{=].j\ ?xeJ‹(%߾{_ȇ(dxۇՏ`%+.ʎ3Ƿx0h6; D=|wJrƸV\ɍ͟ xT4_7.,;9_]yɧ&_Oy 3Bt!Lˋ>XVR7W̯m%mn&sw?-lg2bwweȌX]Vvxo=x|eZjFy+5Af}|eVqX!J%A _Yk ? grF6B}3+Mx]K+e IHX>kaDg0"ĺΤxk5fl;jSC S$8#`_AxyD $ְc0/lE]r6S o775Ye83%NΞ]v0J{qn'ɧxf|\VbsO4Ar,~1K"lJ:]9}yj5TiWjm pQbcC[Z%}n5vB: 9FCL I ,a(%!o@Q9#c/SD%S +vHjP=,>Ih`y2"d5Rn. 
׸Һ>R=U2֞ a0K),xjQZ<}Z+smq]ag%|blR)sB?#]J(DcE9 v {f7h "l * J"LY{/p}`WqV 0ltVM) 3P0i &}, s=~欟Ƿ9H8Q w-:P`0`np- X-eXixP^+tз 4 +s{7+f=` *(ZWm=> Xh9, HF06ns`)%-'ʃx7wƇ}hk%`);XRU$,ā,,\z%78:Ǭv mYm93J7sxkn&LPBK(> Mn41ۤeϤ?kgeO<?Yms4ьg+~ޙъ}wo'3Y8EO?UXJ &YRcѶ΅2{1jKU%ۆ`bsI/"kr{^>9UeR;pTJxɨq3_7!f,5FCwX{6g qhj2jtZ7%eR5Sk 6gN  |)CH5tXYۊI J+)`%%#{l1Bl`91wwӃ(8Ky>dbΒjj:m5yt}/1Y>kyOh, @4pxТۀx҈i U4kNpN9 I*րO*Fs{W\)ak]kR۽zQH}AL(7i_j^V{{Ouoo`@L-=3%7ʚ`>=Ko~Qkx=~;W2O9VzdKQ{޹-Ύ䒈s*_=ϩܑs ܊mP5IMGkܨv1z)/!URf~?L8Ɇ h6 ^k]mUD!Шv &MS)D]<}^]}*AZtsӱKڋ][gaD BT' {9Ltw3uUlPT7|OʡX x,  "qٵc-?RfuK14ZHDV.\jI8W۪!b;R%QzsKS=Q߼{߾\a I~QZ^9R٧T9|>iK~lv3,QC%!y]+J%s;ÓS*1EF [V'H FCP=0 l+kRP4ax0-TumdƛVq!*`6f c 7W<}{s? Φb-śFYS=8KvEI?,6?t. r.wz!h2Eo$KvLSȭd' 3O/wn3'ࢡg-,N#uZ(~zBTq)0.(n1G(F"(GAޥ јξ"1BFOA;K㗷/ z/4Q0yaFq1Vתt@+JvQM(t|5n* Cc[ D*iGZە(UD JOOHv-SeXȽ/\ mO+>t4*oU"jWye`:IETi=cM2Ѫw48UBuU4fv=8U༷!{';="fm[IWIODDucd TbJC4*U}-;Eu@ UA~7eC}829c]MIrFM8c+gƈOEdz%dNr;ΡƜt9_ l*р`EG xER~0~wi3WEƽquv`K7Λ&\Il%{V "%yk90r"]{G2ٔ:'턜Sҋ;'Rٶ+}ɵK$;5bRg #Ր\]*AqEJ 6>i(NRǚUͯ͵}zTբR7wo j'h 02HR*.de墫L?,V$@D #iB, 8(2A Illt j `ImIY"uT!\J%Y6x)WVux`\elwP Lz={;>b^M,V1qa/ *&~fo0Ybʼnhr~zTa[fl[g+bv= 2'uNrto u\BNwcCkBTiрd[5+ 'X$#%,bOIؗ{^OOZD@IMj 7jo +S0y G .u^ǁi4h Ĵ:oZAt"dARlgI3 UDg@h&e V !P' [erj 5σ:!v`9,N t (_t6{&9'펤PyjxPtH_O./.m 䫂pW@`%g#5R l%:%pq}.Y*aYe-sNa4Y@\Zߦ%E0AQve}Tx]m\w_GaN,׋)tާ~7`YVF]/츷WU8~,pnz=ϒS&i "s*!w]vX5ӓ[E1=ƚ\WH0(xĩt^G3-ίgN:/;\ċ<x*7$2gG9'D8еf*nXyvW5nHuOϗ))Z>\LU6)eKnǗ!:h-K?VJ !RŒK_3,O yTN^2HM;i7'ڷ9: gv$Bb`Zk,~xET{~U4N~'/L6XJ#6v]n3`Uf3^y?GWWI/fKjd?T[kb̧/3;DWK>+\x~k2unӘu#dӧW0gϖN 2XѦݭ,jkK1 t7m%h¶(Lf]擧saLxMOՂJőFII4ڴY B.jjukb?_asI0;*7 HIx)NݹXfGC F*Ǫ* aB ]/- K&.~ ATVW(/xLRPȑ2\ k yψ#@SI~3UGT %:%/ĀV * &Ъz>QXw﷘ wjԻ޿d1\*~\}l0XDN<\JJ!$I"jl&APDJx%!NFnbLHq N\K-HJy*<U3D8AyZVaɧ y|I8p }8HS)É q1&F%׆`g,0 & %{=lPN֝JZZgL=[IJT#$㚭}UHЊ?p|P\ˎ*KI7SGQwYJe(=SpB#gRo:U><+50s俜/d#@ m ? ꯗO7C skqg81D)x2M︍ hW Kf/ ;eyɒ<2~,|}%oIn(L q(۴ƛ4Df/x=j' b]Nc^`|6dd ST8c亀9.:ϝN4S`alg^<2ӼJT F8[U@g=]f2)E7lc䷳W3Cyklv'TߋW1޹(XYd-TSmoZ0nCu]`ekx>엏A}e,~QkOg`;*MkGC_\χ/eZ$\2$4rIXp+=WTE%5. 
iO~ъsr_m2HSOr|~UyfXj__Uo :_j*%%<-$N(IA?ĩ̙$"S'-'ٽgQHB'J1M{"jl]FXR"Ijj'1+|T)fZiCA͛R"D;~vwgC*2l?a>״kwr,rUR n77-0S](YQBXmF KB#0JV >p]H4m:\.<_~k.vUB9GFk'*$ѼHBX rʱBv]25خp  r!1=Ǡ>ۆZ`nopCi)A} ܸ(oNJ;\r4 C} 57(o[J*R*Tq϶$yRZVAp%Ww-ĤۋU#/𐒀%Y ,#Â5xdԆ7B(U/ Qe|j}(^TPP옑~m,!>_?%V^j ta]Aj%Vx4_gkdX22_WV%6zWf2]Ҩ=k5IД>H̍ReB 'C:D`Nj5OEATQD-є)n^V% %p2]D"#Ȯ i.,ÀX$O}o.$@(B$X5Hi|`+ŷtc Ѐ6i ’RH fi@m}ɛ\G%/G)kP[F4@\ڀ%r7 '#p&A8t1P W(ī{>cߌ%2ǐ L_UZ+=uJ&nK5?(/]_mw|թl;X,BkD6Bsg-2B MȌ&U.:p J(rZI%8m <^.& w Jq%v@1dH+$!JL~5n37ҀQ.2}r=h(yjhRYGUpdֈxYHD[3x vSJRJ ak;t^DŌ cAoPq?BXI)(5h:wkp;NAο $CIHMP_ AE;{ ,OHSănm1B lYmrBp\LrbU_-o4LC )DtSL,!vj|4RDV7 \I*K2e"!8A]ROE~A7 Gr6Eq4@X" `,eȺ %KvFumCA1>ǹK^DQ<@H_C}GLbSrIс&$CHno@-N޻rRt34aqr=rI Q Td`jP5 b]4اASRR=<ĉ*NGX`_:'A1Gj3EPʼa-x|-߱¯ i)  1`3*m)d"]g o(w3n9䏌@t #FbʁU \ ͼ6eTTɋ[ڬ+x CLb g %8X"v~Q!Bt`Ql(AW9:b3P%{{ lEtժNcFW&%W#ebkmHnvžU_! @8/' ,SMRu*ln]՗Т.lOLGFC5TW灣&SLcx%6(OыF6JҖ8-Em >Ga #а9+ޖAH4ӸU r}t(uUtr.%سMr$˲XC_E6EI's̸lxr!DH\[xf] ,Z7?D:hDǸIYV9%!b.Ԍ8qE4^m7S{Oڤx)IBUPb #A qAS=4N:Ca q`}Bˑc!8ǦҼ9Z1ƍx B8 $|$8=lMzJZR s+wb9jNr⫼]Q_JQhBk7~EȼɌ|F=)CQIRXM'ȀSyȶ5Sңa0)'آz E޿5~-[n j8+7LDyHKh`]hIT $Oٜj ptc\Sq(13X 4W2bm. ᨂBCY mCڶ7/C@2Vw(:!* 8(p&QS\( #KNΧp9Ƙ"LRDspeCn˦،#8%؜u$?lCJ#N QC I! Cmv=iTvUwG]鄬]ԇ fY,GzH-P\Km*t٧禴QԶ~ZkVyJŜ [(qI&9L_צ41OfxY!Jg$4ڒh< H$'[L׼E)kD8XNpK*-FP @gɀЈzFgv&ĶڬvBXaS[i3x H1r2Dbdr%1!bH}S+;խtҦ"2#d8Zoawz.RZ isI>? XnOŐD94YYAkǐ(Ck %B֞4jв #F1A[L7G6`'TkKͬb_l+]OU dÁ騙@(t> [#e7QNQlZ;Tӹ&;"`U*y[Q*;U]L5&J"(k-x^w4^7==\Ѱh^%gc+E"Ke ;1Q8"*(d]k5HU导kԧG ZAʚRk=9x.iˡI!\JjbCFm14d7 9J4J0 AnQo/DWs~O\Y>W. *R$/~\.prrԛMUHtlFFܧ0J䦌W('rlo __{{N#`Jl.._36QcGo[ܯXa)*)mTeebÖG8)Ai4lx6.~Áo'‘OO -; ;jQo1 R*U{z7áN#Wd`;~Si/W:5VQ=KBiEC>#|rp8r.]SlM'?zz>VeH L.pv[).jx:EaoO7~ϗo"tiīWտ{r!(>CrV'sngfKpi2){Dh. 
192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.117299 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:43336->192.168.126.11:17697: read: connection reset by peer" Jan 29 03:27:41 crc kubenswrapper[4707]: W0129 03:27:41.154908 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.155064 4707 trace.go:236] Trace[172091583]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Jan-2026 03:27:31.153) (total time: 10001ms): Jan 29 03:27:41 crc kubenswrapper[4707]: Trace[172091583]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (03:27:41.154) Jan 29 03:27:41 crc kubenswrapper[4707]: Trace[172091583]: [10.001155365s] [10.001155365s] END Jan 29 03:27:41 crc kubenswrapper[4707]: E0129 03:27:41.155098 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan
29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.162734 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.358595 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.361102 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4" exitCode=255 Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.361168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4"} Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.361420 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.362377 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.362424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.362436 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.363129 4707 scope.go:117] "RemoveContainer" containerID="3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4" Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 
03:27:41.501712 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.501848 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.508742 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]log ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]etcd ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/generic-apiserver-start-informers ok Jan 29 03:27:41 
crc kubenswrapper[4707]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/priority-and-fairness-filter ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/start-apiextensions-informers ok Jan 29 03:27:41 crc kubenswrapper[4707]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Jan 29 03:27:41 crc kubenswrapper[4707]: [-]poststarthook/crd-informer-synced failed: reason withheld Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/start-system-namespaces-controller ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 29 03:27:41 crc kubenswrapper[4707]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 29 03:27:41 crc kubenswrapper[4707]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 29 03:27:41 crc kubenswrapper[4707]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Jan 29 03:27:41 crc kubenswrapper[4707]: [-]poststarthook/bootstrap-controller failed: reason withheld Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/start-kube-aggregator-informers ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 29 03:27:41 crc kubenswrapper[4707]: 
[+]poststarthook/apiservice-status-remote-available-controller ok Jan 29 03:27:41 crc kubenswrapper[4707]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 29 03:27:41 crc kubenswrapper[4707]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]autoregister-completion ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/apiservice-openapi-controller ok Jan 29 03:27:41 crc kubenswrapper[4707]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 29 03:27:41 crc kubenswrapper[4707]: livez check failed Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.508808 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 03:27:41 crc kubenswrapper[4707]: I0129 03:27:41.544300 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 21:15:23.703446996 +0000 UTC Jan 29 03:27:42 crc kubenswrapper[4707]: I0129 03:27:42.365978 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 29 03:27:42 crc kubenswrapper[4707]: I0129 03:27:42.367976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39"} Jan 29 03:27:42 crc kubenswrapper[4707]: I0129 03:27:42.368210 4707 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 03:27:42 crc kubenswrapper[4707]: I0129 03:27:42.369459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:42 crc kubenswrapper[4707]: I0129 03:27:42.369522 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:42 crc kubenswrapper[4707]: I0129 03:27:42.369572 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:42 crc kubenswrapper[4707]: I0129 03:27:42.544859 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 19:02:40.43714582 +0000 UTC Jan 29 03:27:43 crc kubenswrapper[4707]: I0129 03:27:43.333374 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:27:43 crc kubenswrapper[4707]: I0129 03:27:43.372128 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 03:27:43 crc kubenswrapper[4707]: I0129 03:27:43.372211 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:27:43 crc kubenswrapper[4707]: I0129 03:27:43.373742 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:43 crc kubenswrapper[4707]: I0129 03:27:43.373799 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:43 crc kubenswrapper[4707]: I0129 03:27:43.373819 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:43 crc kubenswrapper[4707]: I0129 03:27:43.379902 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:27:43 crc kubenswrapper[4707]: I0129 03:27:43.545986 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 01:53:41.271442965 +0000 UTC Jan 29 03:27:44 crc kubenswrapper[4707]: I0129 03:27:44.375931 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 03:27:44 crc kubenswrapper[4707]: I0129 03:27:44.377668 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:44 crc kubenswrapper[4707]: I0129 03:27:44.377730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:44 crc kubenswrapper[4707]: I0129 03:27:44.377747 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:44 crc kubenswrapper[4707]: I0129 03:27:44.546403 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 02:38:02.634288096 +0000 UTC Jan 29 03:27:44 crc kubenswrapper[4707]: I0129 03:27:44.699469 4707 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 29 03:27:44 crc kubenswrapper[4707]: I0129 03:27:44.699682 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 29 03:27:44 crc kubenswrapper[4707]: I0129 03:27:44.771005 4707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.302009 4707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.547307 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 18:27:42.797245611 +0000 UTC Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.548531 4707 apiserver.go:52] "Watching apiserver" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.553645 4707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.553964 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.554431 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.554890 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.554957 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.555009 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.555015 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:45 crc kubenswrapper[4707]: E0129 03:27:45.555080 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:27:45 crc kubenswrapper[4707]: E0129 03:27:45.555204 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.555490 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:45 crc kubenswrapper[4707]: E0129 03:27:45.555606 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.557477 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.557948 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.558679 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.558738 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.558747 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.558893 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.559005 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.558744 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.559412 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.566448 4707 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.583036 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.601441 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.617601 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.633596 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.679063 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.694942 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.706456 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:45 crc kubenswrapper[4707]: I0129 03:27:45.719745 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.501911 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.504874 4707 trace.go:236] Trace[135219374]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Jan-2026 03:27:35.408) (total time: 11096ms): Jan 29 03:27:46 crc kubenswrapper[4707]: Trace[135219374]: ---"Objects listed" error: 11096ms (03:27:46.504) Jan 29 03:27:46 crc kubenswrapper[4707]: Trace[135219374]: [11.096120319s] [11.096120319s] END Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.505335 4707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.510627 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.510888 4707 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.513792 4707 trace.go:236] Trace[1837343613]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Jan-2026 03:27:35.368) (total time: 11145ms): Jan 29 03:27:46 crc kubenswrapper[4707]: Trace[1837343613]: ---"Objects listed" error: 11144ms (03:27:46.513) Jan 29 03:27:46 crc kubenswrapper[4707]: Trace[1837343613]: [11.145235697s] [11.145235697s] END Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.513840 4707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.520693 4707 reflector.go:368] Caches 
populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.548283 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 16:08:34.872224466 +0000 UTC Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611494 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611517 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 29 03:27:46 crc kubenswrapper[4707]: 
I0129 03:27:46.611551 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611622 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611674 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611794 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611814 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611838 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611926 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612215 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " 
Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612275 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612296 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612338 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612429 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 29 03:27:46 
crc kubenswrapper[4707]: I0129 03:27:46.612711 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612738 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612810 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612894 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612919 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612941 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:46 crc 
kubenswrapper[4707]: I0129 03:27:46.612963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613013 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613089 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613140 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613230 
4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613441 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613509 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613531 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613596 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613620 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 
29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613691 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.611922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.612855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613421 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614246 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.614903 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.613716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615122 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 
03:27:46.615315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615399 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615476 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615634 
4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615658 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615709 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615738 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615784 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615810 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615861 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 29 
03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615884 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615913 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.615934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616167 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616385 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616414 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616339 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616966 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.617190 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.617376 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.617476 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.617646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618057 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.616422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618251 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 29 03:27:46 
crc kubenswrapper[4707]: I0129 03:27:46.618312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618371 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618398 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618402 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618451 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618477 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618503 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618654 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618700 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618822 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618879 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618891 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.618976 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619007 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619082 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619174 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619328 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 29 03:27:46 crc 
kubenswrapper[4707]: I0129 03:27:46.619848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620177 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" 
(UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 03:27:46 crc 
kubenswrapper[4707]: I0129 03:27:46.620412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620456 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620496 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620527 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620746 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 
03:27:46.620774 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620804 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620975 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621209 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621269 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621388 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621405 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621420 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621436 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621451 4707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 
03:27:46.621465 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621478 4707 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621492 4707 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621507 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621520 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621554 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621570 4707 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621584 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621597 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621611 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621622 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621640 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621652 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621667 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621679 4707 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node 
\"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621692 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621709 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621723 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621736 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621750 4707 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621763 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621776 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: 
I0129 03:27:46.621790 4707 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621802 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621817 4707 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621832 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621851 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621864 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621876 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621890 4707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621904 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621918 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621931 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621945 4707 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621958 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621970 4707 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621980 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621992 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622003 4707 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622014 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622025 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622042 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622056 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.625067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" 
(UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619524 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619557 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619796 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619303 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.619778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620560 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.620593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.621683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.622815 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.623002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.630771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.623069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.623094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.623103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.623446 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.623756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.623803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.623986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.623937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624615 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624751 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.624998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.625181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.626484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.626621 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.626898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.626990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.626953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.627127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.627181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.627233 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.627315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.627429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.627710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.627798 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.628141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.628335 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.628352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.628439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.629001 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.629098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.629653 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.630332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.630357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.630485 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:27:47.130451886 +0000 UTC m=+20.614680991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.631333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.630955 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.631499 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.631875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.632607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.632622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.633009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.633085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.633325 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.633730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.633984 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.634121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.634310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.634375 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.634398 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.634489 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-29 03:27:47.13446229 +0000 UTC m=+20.618691195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.634854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.635526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.635960 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.637101 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.637168 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:47.137154053 +0000 UTC m=+20.621382958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.638386 4707 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.639173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.639596 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.640286 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.640351 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.640834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.642242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.642385 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.643730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.644245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.645048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.645193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.645612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.645736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.646202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.646340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.646337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.646916 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.647332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.648704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.649065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.649087 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.649398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.649432 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.649688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.651893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.652460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.652512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.654059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.654684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.654932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.656228 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.656238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.657899 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.658344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.659393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.659700 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.661245 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.661282 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.661298 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.661377 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:47.161353791 +0000 UTC m=+20.645582876 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.661625 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.661661 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.661679 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:46 crc kubenswrapper[4707]: E0129 03:27:46.661755 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:47.161729032 +0000 UTC m=+20.645957937 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.668338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.673351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.673839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.673862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.676750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.677781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.678219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.678896 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.679041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.679294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.679377 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.679679 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.679860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.679940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.680099 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.685363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.685908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.687449 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.687632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.687659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.689766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.690019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.690030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.690343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.690512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.690926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.697061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.697149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.697359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.698242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.700678 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.700731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.704968 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.709053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.709744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722642 4707 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" 
Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722664 4707 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722679 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722691 4707 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722703 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722716 4707 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722732 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722748 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 
03:27:46.722761 4707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722773 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722784 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722797 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722813 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722826 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722840 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722854 4707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722866 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722878 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722889 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722900 4707 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722912 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722924 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722937 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722951 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722964 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722976 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722987 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.722999 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723010 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723022 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723033 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723045 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723057 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723069 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723080 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723092 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723103 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723116 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.723289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724254 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724292 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724304 4707 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 29 
03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724318 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724329 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724341 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724353 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724373 4707 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724389 4707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724400 4707 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724412 4707 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724423 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724433 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724444 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724454 4707 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724466 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724477 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724489 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724500 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724513 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724525 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724557 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724570 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724581 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724594 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724605 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724617 4707 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724628 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724640 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724652 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724665 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724677 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724690 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724702 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724714 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724726 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724740 4707 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724753 4707 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724765 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724781 4707 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724793 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724805 4707 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724819 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724833 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724846 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724859 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724871 4707 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724884 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724900 4707 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724912 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724926 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724937 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724949 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724960 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724973 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724985 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.724997 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725010 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725024 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725036 4707 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725048 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 
03:27:46.725059 4707 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725071 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725082 4707 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725094 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725106 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725118 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725129 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725141 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725153 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725164 4707 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725177 4707 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725201 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725216 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725229 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725241 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") 
on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725254 4707 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725265 4707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725277 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725288 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725299 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725310 4707 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725324 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725335 4707 
reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725346 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725358 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725369 4707 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725380 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725391 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725402 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725414 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725425 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725437 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725447 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725462 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725473 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725485 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725496 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 
03:27:46.725511 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725523 4707 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725553 4707 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725567 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725578 4707 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725590 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725601 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.725612 4707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.727048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.728283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.772799 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.786743 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.793627 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 03:27:46 crc kubenswrapper[4707]: W0129 03:27:46.794711 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-d00b1858e8b717d4250cef7583858b4028f69551346b484641548302ef49eeb1 WatchSource:0}: Error finding container d00b1858e8b717d4250cef7583858b4028f69551346b484641548302ef49eeb1: Status 404 returned error can't find the container with id d00b1858e8b717d4250cef7583858b4028f69551346b484641548302ef49eeb1 Jan 29 03:27:46 crc kubenswrapper[4707]: W0129 03:27:46.801680 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-0b04878742719e24fa70f5069139c1dc501d1104f56ebc4e8691582a603b4d31 WatchSource:0}: Error finding container 0b04878742719e24fa70f5069139c1dc501d1104f56ebc4e8691582a603b4d31: Status 404 returned error can't find the container with id 0b04878742719e24fa70f5069139c1dc501d1104f56ebc4e8691582a603b4d31 Jan 29 03:27:46 crc kubenswrapper[4707]: W0129 03:27:46.811615 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-490dd628ae6166d83ab744905f0fd0d0e6eb77c692439e6f91ceb9252fb5bbd8 WatchSource:0}: Error finding container 490dd628ae6166d83ab744905f0fd0d0e6eb77c692439e6f91ceb9252fb5bbd8: Status 404 returned error can't find the container with id 490dd628ae6166d83ab744905f0fd0d0e6eb77c692439e6f91ceb9252fb5bbd8 Jan 29 03:27:46 crc kubenswrapper[4707]: I0129 03:27:46.826883 4707 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:46 
crc kubenswrapper[4707]: I0129 03:27:46.826929 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.230785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.230874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.230898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.230929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:47 crc 
kubenswrapper[4707]: I0129 03:27:47.230948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231105 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231127 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231140 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231207 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:48.231184775 +0000 UTC m=+21.715413680 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231396 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231481 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:27:48.231466394 +0000 UTC m=+21.715695289 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231501 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231524 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231554 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231506 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:48.231498065 +0000 UTC m=+21.715726970 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231696 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231743 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:48.23165441 +0000 UTC m=+21.715883335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.231821 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:48.231795374 +0000 UTC m=+21.716024279 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.245420 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.245493 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.245641 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.245720 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.245773 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:27:47 crc kubenswrapper[4707]: E0129 03:27:47.245827 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.247896 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.248431 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.249330 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.250052 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.250799 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.251402 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.252050 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.252709 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.253399 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.254032 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.254526 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.255181 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.257968 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.258483 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.259101 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.259982 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.260259 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.260592 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.261401 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.262013 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.262653 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.263527 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.264177 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.265131 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.265805 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.266202 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.267263 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d"
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.268043 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.269050 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.269661 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.270514 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.271061 4707 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.271169 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.272236 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.273773 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.274315 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.274754 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.277744 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" 
path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.279488 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.280575 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.282086 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.282873 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.283834 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.284504 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.286482 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.287250 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a"
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.288227 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.288930 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.289721 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.290393 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.291764 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.292761 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.293233 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.293766 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.294721 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.295330 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.296359 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.296985 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.303157 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.311898 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.312263 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.314595 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.325238 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.341779 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.361717 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.377402 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.385160 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0b04878742719e24fa70f5069139c1dc501d1104f56ebc4e8691582a603b4d31"} Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.387191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d"} Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.387258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d00b1858e8b717d4250cef7583858b4028f69551346b484641548302ef49eeb1"} Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.389285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b"} Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.389402 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d"} Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.389467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"490dd628ae6166d83ab744905f0fd0d0e6eb77c692439e6f91ceb9252fb5bbd8"} Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.389832 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.399798 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.410929 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.423708 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.435301 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.445361 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.456013 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.466327 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.477763 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.489407 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.509848 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.527512 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.544143 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.548836 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 00:52:47.470045871 +0000 UTC Jan 29 03:27:47 crc kubenswrapper[4707]: I0129 03:27:47.558195 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:48 crc kubenswrapper[4707]: I0129 03:27:48.240415 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:48 crc kubenswrapper[4707]: I0129 03:27:48.240530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:48 crc kubenswrapper[4707]: I0129 03:27:48.240602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:48 crc kubenswrapper[4707]: I0129 03:27:48.240640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.240791 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.240816 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.240836 4707 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.241088 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.241135 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.241161 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.241276 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.240792 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:27:50.240731565 +0000 UTC m=+23.724960480 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:27:48 crc kubenswrapper[4707]: I0129 03:27:48.241499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.241583 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:50.241528239 +0000 UTC m=+23.725757294 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.241601 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.241839 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:50.241734706 +0000 UTC m=+23.725963621 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.241939 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:50.241924092 +0000 UTC m=+23.726153017 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:48 crc kubenswrapper[4707]: E0129 03:27:48.242034 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:50.242024585 +0000 UTC m=+23.726253500 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:48 crc kubenswrapper[4707]: I0129 03:27:48.549594 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 19:28:06.630382172 +0000 UTC Jan 29 03:27:49 crc kubenswrapper[4707]: I0129 03:27:49.243473 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:49 crc kubenswrapper[4707]: I0129 03:27:49.243521 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:49 crc kubenswrapper[4707]: I0129 03:27:49.243521 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:49 crc kubenswrapper[4707]: E0129 03:27:49.243685 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:27:49 crc kubenswrapper[4707]: E0129 03:27:49.243796 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:27:49 crc kubenswrapper[4707]: E0129 03:27:49.243887 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:27:49 crc kubenswrapper[4707]: I0129 03:27:49.550503 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 16:23:45.297938741 +0000 UTC Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.261630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.261765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.261809 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.261850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.261888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.261990 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262026 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262051 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262060 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:54.262041062 +0000 UTC m=+27.746269957 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262072 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262172 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262186 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:27:54.262084163 +0000 UTC m=+27.746313098 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262280 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262345 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262352 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:54.262276539 +0000 UTC m=+27.746505484 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262371 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262450 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:54.262421124 +0000 UTC m=+27.746650069 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:50 crc kubenswrapper[4707]: E0129 03:27:50.262640 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 03:27:54.262528557 +0000 UTC m=+27.746757662 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.400624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43"} Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.420671 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:50Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.441436 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:50Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.463126 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:50Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.483293 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:50Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.504938 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:50Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.526757 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:50Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.551055 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 
07:27:53.902890453 +0000 UTC Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.557794 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-2
9T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:50Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:50 crc kubenswrapper[4707]: I0129 03:27:50.574566 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:50Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.243142 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.243240 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.243311 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:51 crc kubenswrapper[4707]: E0129 03:27:51.243400 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:27:51 crc kubenswrapper[4707]: E0129 03:27:51.243528 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:27:51 crc kubenswrapper[4707]: E0129 03:27:51.243701 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.551981 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 18:50:38.426254247 +0000 UTC Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.710837 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.723860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.725073 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.742286 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.762164 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.783418 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b98
5b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.799851 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, 
/tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.812760 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.826279 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.841608 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.855795 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.869657 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.883585 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.897887 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.920722 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.939394 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.955452 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.976861 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:51 crc kubenswrapper[4707]: I0129 03:27:51.992780 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:51Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.010937 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.030596 4707 csr.go:261] certificate signing request csr-wc9gt is approved, waiting to be issued Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.077305 4707 
csr.go:257] certificate signing request csr-wc9gt is issued Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.145436 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.162897 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-
operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.175457 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.191015 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.222509 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.244256 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.279042 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.295155 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.307577 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.325810 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: E0129 03:27:52.416136 4707 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.552234 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 01:59:14.738332812 +0000 UTC Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.910974 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.913170 4707 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.913722 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.913738 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.913826 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.925496 4707 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.925948 4707 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.927338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.927386 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.927397 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.927421 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.927433 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:52Z","lastTransitionTime":"2026-01-29T03:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:52 crc kubenswrapper[4707]: E0129 03:27:52.946161 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.953463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.953521 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.953551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.953575 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.953586 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:52Z","lastTransitionTime":"2026-01-29T03:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:52 crc kubenswrapper[4707]: E0129 03:27:52.968260 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.972850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.972888 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.972902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.972922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:52 crc kubenswrapper[4707]: I0129 03:27:52.972937 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:52Z","lastTransitionTime":"2026-01-29T03:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:52 crc kubenswrapper[4707]: E0129 03:27:52.999088 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:52Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.006390 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.006438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.006448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.006469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.006404 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hbz9l"] Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.006481 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.006875 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-t4vft"] Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.007034 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-lnjls"] Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.007150 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.007226 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t4vft" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.008223 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.010582 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-vh9xt"] Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.010905 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.012345 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.012345 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.012769 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.012909 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.012930 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.013098 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.013302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.013311 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.013385 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.013425 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.013445 4707 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.013559 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.013817 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.017598 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.018186 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.041123 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: E0129 03:27:53.049858 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.057202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.057249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.057260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.057296 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.057309 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.060861 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: E0129 03:27:53.071519 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: E0129 03:27:53.071673 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.073832 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.073869 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.073879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.073897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.073909 4707 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.078122 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-29 03:22:52 +0000 UTC, rotation deadline is 2026-12-04 17:26:50.041082124 +0000 UTC Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.078203 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7429h58m56.962883179s for next certificate rotation Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.078256 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-os-release\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-var-lib-cni-bin\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-run-multus-certs\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-cnibin\") pod \"multus-additional-cni-plugins-lnjls\" 
(UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df12d101-b13d-4276-94b7-422c6609d2e8-mcd-auth-proxy-config\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091667 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-system-cni-dir\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxcpr\" (UniqueName: \"kubernetes.io/projected/bd938209-46da-4f33-8496-23beb193ac96-kube-api-access-bxcpr\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df12d101-b13d-4276-94b7-422c6609d2e8-proxy-tls\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091727 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/bd938209-46da-4f33-8496-23beb193ac96-cni-binary-copy\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-var-lib-cni-multus\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzldv\" (UniqueName: \"kubernetes.io/projected/df12d101-b13d-4276-94b7-422c6609d2e8-kube-api-access-zzldv\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-run-netns\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.091971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bedc9546-6a4d-44ec-b95f-84c3329307cf-cni-binary-copy\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-lbztl\" (UniqueName: \"kubernetes.io/projected/bedc9546-6a4d-44ec-b95f-84c3329307cf-kube-api-access-lbztl\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/df12d101-b13d-4276-94b7-422c6609d2e8-rootfs\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-multus-cni-dir\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-system-cni-dir\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bedc9546-6a4d-44ec-b95f-84c3329307cf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: 
I0129 03:27:53.092208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-multus-socket-dir-parent\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092232 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-run-k8s-cni-cncf-io\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-hostroot\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092384 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d25a01e8-a854-424f-a238-e41c41cea5f3-hosts-file\") pod \"node-resolver-t4vft\" (UID: \"d25a01e8-a854-424f-a238-e41c41cea5f3\") " pod="openshift-dns/node-resolver-t4vft" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-cnibin\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092478 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-var-lib-kubelet\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-os-release\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhhx5\" (UniqueName: \"kubernetes.io/projected/d25a01e8-a854-424f-a238-e41c41cea5f3-kube-api-access-mhhx5\") pod \"node-resolver-t4vft\" (UID: \"d25a01e8-a854-424f-a238-e41c41cea5f3\") " pod="openshift-dns/node-resolver-t4vft" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-multus-conf-dir\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-etc-kubernetes\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.092627 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bd938209-46da-4f33-8496-23beb193ac96-multus-daemon-config\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.109041 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.128069 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.143331 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.158878 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.173475 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.176714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.176848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.176929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.177012 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.177073 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.187249 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.193719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-multus-socket-dir-parent\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.193919 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-multus-socket-dir-parent\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.193936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-run-k8s-cni-cncf-io\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-hostroot\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194083 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d25a01e8-a854-424f-a238-e41c41cea5f3-hosts-file\") pod \"node-resolver-t4vft\" (UID: \"d25a01e8-a854-424f-a238-e41c41cea5f3\") " pod="openshift-dns/node-resolver-t4vft" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-cnibin\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-var-lib-kubelet\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-os-release\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " 
pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194153 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhhx5\" (UniqueName: \"kubernetes.io/projected/d25a01e8-a854-424f-a238-e41c41cea5f3-kube-api-access-mhhx5\") pod \"node-resolver-t4vft\" (UID: \"d25a01e8-a854-424f-a238-e41c41cea5f3\") " pod="openshift-dns/node-resolver-t4vft" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-multus-conf-dir\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-etc-kubernetes\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bd938209-46da-4f33-8496-23beb193ac96-multus-daemon-config\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc 
kubenswrapper[4707]: I0129 03:27:53.194213 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-hostroot\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-os-release\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-etc-kubernetes\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-cnibin\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-run-multus-certs\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-os-release\") pod 
\"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194395 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-var-lib-cni-bin\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-multus-conf-dir\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-run-multus-certs\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-var-lib-kubelet\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-var-lib-cni-bin\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 
03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-os-release\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df12d101-b13d-4276-94b7-422c6609d2e8-mcd-auth-proxy-config\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-system-cni-dir\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d25a01e8-a854-424f-a238-e41c41cea5f3-hosts-file\") pod \"node-resolver-t4vft\" (UID: \"d25a01e8-a854-424f-a238-e41c41cea5f3\") " pod="openshift-dns/node-resolver-t4vft" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-run-k8s-cni-cncf-io\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194587 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-system-cni-dir\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxcpr\" (UniqueName: \"kubernetes.io/projected/bd938209-46da-4f33-8496-23beb193ac96-kube-api-access-bxcpr\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-cnibin\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df12d101-b13d-4276-94b7-422c6609d2e8-proxy-tls\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194869 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd938209-46da-4f33-8496-23beb193ac96-cni-binary-copy\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-var-lib-cni-multus\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-cnibin\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzldv\" (UniqueName: \"kubernetes.io/projected/df12d101-b13d-4276-94b7-422c6609d2e8-kube-api-access-zzldv\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.194984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-run-netns\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-var-lib-cni-multus\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/bedc9546-6a4d-44ec-b95f-84c3329307cf-cni-binary-copy\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195054 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-host-run-netns\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bd938209-46da-4f33-8496-23beb193ac96-multus-daemon-config\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbztl\" (UniqueName: \"kubernetes.io/projected/bedc9546-6a4d-44ec-b95f-84c3329307cf-kube-api-access-lbztl\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/df12d101-b13d-4276-94b7-422c6609d2e8-rootfs\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-multus-cni-dir\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-system-cni-dir\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195194 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/df12d101-b13d-4276-94b7-422c6609d2e8-rootfs\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bedc9546-6a4d-44ec-b95f-84c3329307cf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/bedc9546-6a4d-44ec-b95f-84c3329307cf-system-cni-dir\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd938209-46da-4f33-8496-23beb193ac96-multus-cni-dir\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df12d101-b13d-4276-94b7-422c6609d2e8-mcd-auth-proxy-config\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd938209-46da-4f33-8496-23beb193ac96-cni-binary-copy\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bedc9546-6a4d-44ec-b95f-84c3329307cf-cni-binary-copy\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.195959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/bedc9546-6a4d-44ec-b95f-84c3329307cf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.207961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df12d101-b13d-4276-94b7-422c6609d2e8-proxy-tls\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.210189 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.210892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhhx5\" (UniqueName: \"kubernetes.io/projected/d25a01e8-a854-424f-a238-e41c41cea5f3-kube-api-access-mhhx5\") pod \"node-resolver-t4vft\" (UID: \"d25a01e8-a854-424f-a238-e41c41cea5f3\") " pod="openshift-dns/node-resolver-t4vft" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.212004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzldv\" (UniqueName: \"kubernetes.io/projected/df12d101-b13d-4276-94b7-422c6609d2e8-kube-api-access-zzldv\") pod \"machine-config-daemon-hbz9l\" (UID: \"df12d101-b13d-4276-94b7-422c6609d2e8\") " pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.213711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxcpr\" (UniqueName: \"kubernetes.io/projected/bd938209-46da-4f33-8496-23beb193ac96-kube-api-access-bxcpr\") pod \"multus-vh9xt\" (UID: \"bd938209-46da-4f33-8496-23beb193ac96\") " pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.217299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbztl\" (UniqueName: \"kubernetes.io/projected/bedc9546-6a4d-44ec-b95f-84c3329307cf-kube-api-access-lbztl\") pod \"multus-additional-cni-plugins-lnjls\" (UID: \"bedc9546-6a4d-44ec-b95f-84c3329307cf\") " pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.224770 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.237176 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.242870 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:53 crc kubenswrapper[4707]: E0129 03:27:53.243046 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.242894 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:53 crc kubenswrapper[4707]: E0129 03:27:53.243139 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.243091 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:53 crc kubenswrapper[4707]: E0129 03:27:53.243195 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.256340 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.270333 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.279097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.279163 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.279174 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.279196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.279209 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.288461 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.301487 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.321726 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db3448
4d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.324713 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t4vft" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.336494 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.337979 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.345467 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lnjls" Jan 29 03:27:53 crc kubenswrapper[4707]: W0129 03:27:53.349935 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf12d101_b13d_4276_94b7_422c6609d2e8.slice/crio-032ad3f78edcc7eec20d14216542ba630a44e1326e401913365ab6af9b9aeddf WatchSource:0}: Error finding container 032ad3f78edcc7eec20d14216542ba630a44e1326e401913365ab6af9b9aeddf: Status 404 returned error can't find the container with id 032ad3f78edcc7eec20d14216542ba630a44e1326e401913365ab6af9b9aeddf Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.354492 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.356452 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vh9xt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.377855 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c18
6f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.386408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.386448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.386463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.386484 4707 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.386496 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.392314 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nn7fm"] Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.393402 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.395626 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.395809 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.395809 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.396143 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.396225 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.396525 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 
29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.396683 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.398585 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.419932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vh9xt" event={"ID":"bd938209-46da-4f33-8496-23beb193ac96","Type":"ContainerStarted","Data":"74c0a214c2d95107a50dc96f81bb4025f089796e8d6256db5bb59dce8e0dd1ff"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.420929 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.422758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" event={"ID":"bedc9546-6a4d-44ec-b95f-84c3329307cf","Type":"ContainerStarted","Data":"1d5cc877d6d652413d4bb99709004e8444a8a0265562fbac5ffb47cdce140446"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.424239 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"032ad3f78edcc7eec20d14216542ba630a44e1326e401913365ab6af9b9aeddf"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.425686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t4vft" event={"ID":"d25a01e8-a854-424f-a238-e41c41cea5f3","Type":"ContainerStarted","Data":"38eb869c0e336725c2939a19519534c63d4e85410a5f578373d0f04ee29e8587"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.435931 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.450777 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.470182 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.486819 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-netns\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-netd\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497080 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-kubelet\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497095 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-systemd-units\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-bin\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497138 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-var-lib-openvswitch\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-openvswitch\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc 
kubenswrapper[4707]: I0129 03:27:53.497221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxg4m\" (UniqueName: \"kubernetes.io/projected/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-kube-api-access-wxg4m\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-env-overrides\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-slash\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-etc-openvswitch\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-config\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovn-node-metrics-cert\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-systemd\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-log-socket\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497374 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497391 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-node-log\") pod \"ovnkube-node-nn7fm\" (UID: 
\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497405 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-script-lib\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.497418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-ovn\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.501837 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.501884 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.501895 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.501914 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.501925 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.538056 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d
20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.553670 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 12:09:47.282828765 +0000 UTC Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.588516 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-netns\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-netd\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-kubelet\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-systemd-units\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-bin\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-var-lib-openvswitch\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-openvswitch\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxg4m\" (UniqueName: \"kubernetes.io/projected/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-kube-api-access-wxg4m\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-env-overrides\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598804 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-slash\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598838 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-etc-openvswitch\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-config\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovn-node-metrics-cert\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-systemd\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-log-socket\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598924 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-node-log\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-script-lib\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.598988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-ovn\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599048 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-ovn\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-netns\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-netd\") pod \"ovnkube-node-nn7fm\" (UID: 
\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-kubelet\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-systemd-units\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-bin\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599213 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-var-lib-openvswitch\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-openvswitch\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.599996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-env-overrides\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.600042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-slash\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.600065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-etc-openvswitch\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.600462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-config\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.600705 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.600739 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-systemd\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.600736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-node-log\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.600823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-log-socket\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.601546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-script-lib\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.606466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovn-node-metrics-cert\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.606861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.606901 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.606913 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.606933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.606944 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.616170 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.623702 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxg4m\" (UniqueName: \"kubernetes.io/projected/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-kube-api-access-wxg4m\") pod \"ovnkube-node-nn7fm\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.632278 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.646816 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.661266 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.674907 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.692515 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.704750 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.709913 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.709956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.709967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.710012 4707 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.710024 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.712226 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.719966 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: W0129 03:27:53.724286 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3eccef7_1d8e_42b5_b7c8_2cd378b7465a.slice/crio-6669b176e855d9fbffff2fdbd7431d03f35e4b019d06e6672dcb1aad7085471b WatchSource:0}: Error finding container 6669b176e855d9fbffff2fdbd7431d03f35e4b019d06e6672dcb1aad7085471b: Status 404 returned error can't find the container with id 6669b176e855d9fbffff2fdbd7431d03f35e4b019d06e6672dcb1aad7085471b Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.746721 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:53Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.813353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.813416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.813430 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.813484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.813498 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.917071 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.917123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.917133 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.917150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:53 crc kubenswrapper[4707]: I0129 03:27:53.917169 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:53Z","lastTransitionTime":"2026-01-29T03:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.021232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.021286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.021304 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.021328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.021346 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.124705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.124746 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.124757 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.124771 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.124781 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.227779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.227807 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.227816 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.227830 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.227839 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.305898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306106 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-29 03:28:02.30606754 +0000 UTC m=+35.790296465 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.306197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.306249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.306293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.306317 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306388 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306446 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:02.306430611 +0000 UTC m=+35.790659506 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306459 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306503 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306521 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:54 crc 
kubenswrapper[4707]: E0129 03:27:54.306554 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306577 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:02.306530764 +0000 UTC m=+35.790759709 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306606 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:02.306593596 +0000 UTC m=+35.790822541 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306629 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306642 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306651 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:54 crc kubenswrapper[4707]: E0129 03:27:54.306683 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:02.306673058 +0000 UTC m=+35.790901973 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.331033 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.331129 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.331147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.331171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.331189 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.430827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t4vft" event={"ID":"d25a01e8-a854-424f-a238-e41c41cea5f3","Type":"ContainerStarted","Data":"31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.433148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.433187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.433229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.433247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.433263 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.433355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vh9xt" event={"ID":"bd938209-46da-4f33-8496-23beb193ac96","Type":"ContainerStarted","Data":"7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.435529 4707 generic.go:334] "Generic (PLEG): container finished" podID="bedc9546-6a4d-44ec-b95f-84c3329307cf" containerID="d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457" exitCode=0 Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.435620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" event={"ID":"bedc9546-6a4d-44ec-b95f-84c3329307cf","Type":"ContainerDied","Data":"d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.441502 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.441702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.448873 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463" exitCode=0 Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.448932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.448968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"6669b176e855d9fbffff2fdbd7431d03f35e4b019d06e6672dcb1aad7085471b"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.449455 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:
47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.478727 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.496596 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.511941 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.560233 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 09:30:49.443955592 +0000 UTC Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.566202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.566228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.566239 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.566257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.566273 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.572568 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.589835 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.602977 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.618736 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.644393 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.658489 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.670039 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.670082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.670091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 
03:27:54.670108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.670119 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.672321 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.690055 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.704155 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.719314 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.739458 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.751673 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 
03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.769575 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.774639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.774686 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.774695 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.774712 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.774723 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.788519 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.805518 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.825062 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.847699 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.879798 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.880101 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.880187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.880272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.880408 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.880700 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.898884 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.919982 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.941621 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.955180 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.975611 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.988657 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.988705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.988716 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.988734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.988748 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:54Z","lastTransitionTime":"2026-01-29T03:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:54 crc kubenswrapper[4707]: I0129 03:27:54.998068 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.092184 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.092508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.092519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.092568 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.092584 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:55Z","lastTransitionTime":"2026-01-29T03:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.197880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.197920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.197933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.197952 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.197966 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:55Z","lastTransitionTime":"2026-01-29T03:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.243993 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:55 crc kubenswrapper[4707]: E0129 03:27:55.244230 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.244454 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.244712 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:55 crc kubenswrapper[4707]: E0129 03:27:55.244795 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:27:55 crc kubenswrapper[4707]: E0129 03:27:55.244703 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.301365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.301403 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.301413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.301431 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.301443 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:55Z","lastTransitionTime":"2026-01-29T03:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.405223 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.405276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.405290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.405311 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.405325 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:55Z","lastTransitionTime":"2026-01-29T03:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.460191 4707 generic.go:334] "Generic (PLEG): container finished" podID="bedc9546-6a4d-44ec-b95f-84c3329307cf" containerID="ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d" exitCode=0 Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.460288 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" event={"ID":"bedc9546-6a4d-44ec-b95f-84c3329307cf","Type":"ContainerDied","Data":"ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.469831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.470376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.470389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.470423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.470436 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.471003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.480730 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc
0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.495446 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.509354 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.509499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 
03:27:55.509516 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.509524 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.509569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.509581 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:55Z","lastTransitionTime":"2026-01-29T03:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.525381 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.538703 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.551602 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.560425 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 10:13:57.995704848 +0000 UTC Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.566916 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.586666 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.600853 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.614363 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.617457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.617487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.617499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.617518 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.617532 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:55Z","lastTransitionTime":"2026-01-29T03:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.627493 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.642068 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.657490 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.681510 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.696245 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pf578"] Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.696832 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.699143 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.699167 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.699604 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.701358 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.720355 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.720612 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.720892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.721014 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.721142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.721242 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:55Z","lastTransitionTime":"2026-01-29T03:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.770420 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.793260 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 
03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.812417 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.823259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.823293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.823303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.823319 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.823331 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:55Z","lastTransitionTime":"2026-01-29T03:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.826199 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c3b3d28-c2ba-4aea-b865-e72c6327eb5b-host\") pod \"node-ca-pf578\" (UID: \"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\") " pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.826268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5zbx\" (UniqueName: \"kubernetes.io/projected/1c3b3d28-c2ba-4aea-b865-e72c6327eb5b-kube-api-access-t5zbx\") pod \"node-ca-pf578\" (UID: \"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\") " pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.826304 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c3b3d28-c2ba-4aea-b865-e72c6327eb5b-serviceca\") pod \"node-ca-pf578\" (UID: \"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\") " pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.828286 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.840288 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.851014 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.862071 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.874805 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.894531 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.906910 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.918792 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.926113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.926169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.926184 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.926204 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.926219 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:55Z","lastTransitionTime":"2026-01-29T03:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.926781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c3b3d28-c2ba-4aea-b865-e72c6327eb5b-host\") pod \"node-ca-pf578\" (UID: \"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\") " pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.926839 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5zbx\" (UniqueName: \"kubernetes.io/projected/1c3b3d28-c2ba-4aea-b865-e72c6327eb5b-kube-api-access-t5zbx\") pod \"node-ca-pf578\" (UID: \"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\") " pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.926875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c3b3d28-c2ba-4aea-b865-e72c6327eb5b-serviceca\") pod \"node-ca-pf578\" (UID: \"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\") " pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.927118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c3b3d28-c2ba-4aea-b865-e72c6327eb5b-host\") pod \"node-ca-pf578\" (UID: \"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\") " pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.927847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c3b3d28-c2ba-4aea-b865-e72c6327eb5b-serviceca\") pod \"node-ca-pf578\" (UID: \"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\") " pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 
03:27:55.935325 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.949108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5zbx\" (UniqueName: \"kubernetes.io/projected/1c3b3d28-c2ba-4aea-b865-e72c6327eb5b-kube-api-access-t5zbx\") pod \"node-ca-pf578\" (UID: \"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\") " 
pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.951449 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:55 crc kubenswrapper[4707]: I0129 03:27:55.971701 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:55Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.010131 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pf578" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.029563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: W0129 03:27:56.029594 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c3b3d28_c2ba_4aea_b865_e72c6327eb5b.slice/crio-96616b524f0833a9c934d8a643383c7579f6434d4fe2268220deacea2bb1f883 WatchSource:0}: Error finding container 96616b524f0833a9c934d8a643383c7579f6434d4fe2268220deacea2bb1f883: Status 404 returned error can't find the container with id 96616b524f0833a9c934d8a643383c7579f6434d4fe2268220deacea2bb1f883 Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.029637 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.029658 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.029687 
4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.029706 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.132448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.132514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.132532 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.132583 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.132601 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.236022 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.236101 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.236121 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.236147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.236164 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.339450 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.339508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.339519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.339569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.339596 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.442388 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.442446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.442455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.442472 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.442482 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.475991 4707 generic.go:334] "Generic (PLEG): container finished" podID="bedc9546-6a4d-44ec-b95f-84c3329307cf" containerID="5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671" exitCode=0 Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.476351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" event={"ID":"bedc9546-6a4d-44ec-b95f-84c3329307cf","Type":"ContainerDied","Data":"5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.478621 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pf578" event={"ID":"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b","Type":"ContainerStarted","Data":"4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.478659 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pf578" event={"ID":"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b","Type":"ContainerStarted","Data":"96616b524f0833a9c934d8a643383c7579f6434d4fe2268220deacea2bb1f883"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.491034 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.512492 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.526432 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.540990 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.546022 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.546077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.546090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.546116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.546132 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.554262 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.561637 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 04:46:03.767196419 +0000 UTC Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.572158 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 
03:27:56.585943 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.598531 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.612939 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.627080 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.642104 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.649820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.649855 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.649870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.649891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.649906 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.654760 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.667620 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.681350 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.705819 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.720929 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.739625 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 
03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.752659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.752706 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.752720 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.752740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.752753 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.753998 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.770490 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.783280 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.798259 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.808631 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.828266 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.840348 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.852724 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.856134 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.856190 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.856210 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.856234 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.856250 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.866029 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.875996 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.896768 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.908251 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.925588 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:56Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.959221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.959300 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.959324 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.959354 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.959374 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:56Z","lastTransitionTime":"2026-01-29T03:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:56 crc kubenswrapper[4707]: I0129 03:27:56.994955 4707 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.062215 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.062255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.062265 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.062283 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.062295 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.165704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.166092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.166169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.166234 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.166339 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.242598 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:57 crc kubenswrapper[4707]: E0129 03:27:57.243042 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.242776 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:57 crc kubenswrapper[4707]: E0129 03:27:57.243568 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.242741 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:57 crc kubenswrapper[4707]: E0129 03:27:57.243788 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.258061 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.269363 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.269722 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.269802 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.269881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.269946 
4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.277066 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\"
:\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.290156 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.323727 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.351258 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.369645 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.372952 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.373116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.373228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.373321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.373406 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.382174 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.394986 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.419350 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.438163 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.462218 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.476737 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.476925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.477018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.477084 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.477152 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.480294 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
8b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.487159 4707 generic.go:334] "Generic (PLEG): container finished" podID="bedc9546-6a4d-44ec-b95f-84c3329307cf" containerID="1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64" exitCode=0 Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.487406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" event={"ID":"bedc9546-6a4d-44ec-b95f-84c3329307cf","Type":"ContainerDied","Data":"1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.499014 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.516941 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.538031 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 
03:27:57.555179 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.562046 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 14:56:53.007942195 +0000 UTC Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.570572 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.580515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.580595 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.580610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc 
kubenswrapper[4707]: I0129 03:27:57.580635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.580651 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.586451 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.604586 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.628337 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.644653 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.662379 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.677666 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.683104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.683152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.683162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.683179 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.683189 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.692789 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.709746 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.725255 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.740433 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.754058 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.780921 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.785912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.785956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.785968 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.785986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.785996 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.819732 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.889445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.889809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.889879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.889954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.890012 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.993055 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.993141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.993153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.993171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:57 crc kubenswrapper[4707]: I0129 03:27:57.993181 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:57Z","lastTransitionTime":"2026-01-29T03:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.095766 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.095810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.095820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.095838 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.095851 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:58Z","lastTransitionTime":"2026-01-29T03:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.198728 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.198773 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.198785 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.198803 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.198814 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:58Z","lastTransitionTime":"2026-01-29T03:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.301823 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.302232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.302245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.302267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.302281 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:58Z","lastTransitionTime":"2026-01-29T03:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.405439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.405481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.405491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.405509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.405524 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:58Z","lastTransitionTime":"2026-01-29T03:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.496202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.500135 4707 generic.go:334] "Generic (PLEG): container finished" podID="bedc9546-6a4d-44ec-b95f-84c3329307cf" containerID="b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9" exitCode=0 Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.500176 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" event={"ID":"bedc9546-6a4d-44ec-b95f-84c3329307cf","Type":"ContainerDied","Data":"b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.511442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.511474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.511484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.511500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.511510 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:58Z","lastTransitionTime":"2026-01-29T03:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.512643 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.533568 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.545724 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.556466 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.563213 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 03:08:17.356399429 +0000 UTC Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.570132 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.584887 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.597908 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.611683 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.616143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.616177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.616188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.616203 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.616214 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:58Z","lastTransitionTime":"2026-01-29T03:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.624071 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.635874 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd83821
0bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.652354 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b267
02f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e
8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.664089 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.677461 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.687977 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.706589 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:58Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.719907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.719970 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.719982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.720007 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.720021 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:58Z","lastTransitionTime":"2026-01-29T03:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.823653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.823699 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.823709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.823728 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.823742 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:58Z","lastTransitionTime":"2026-01-29T03:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.927068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.927122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.927135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.927154 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:58 crc kubenswrapper[4707]: I0129 03:27:58.927168 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:58Z","lastTransitionTime":"2026-01-29T03:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.029923 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.029983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.029995 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.030012 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.030023 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.132733 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.133001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.133081 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.133126 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.133155 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.236372 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.236453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.236466 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.236490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.236504 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.243738 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:27:59 crc kubenswrapper[4707]: E0129 03:27:59.244251 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.244330 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:27:59 crc kubenswrapper[4707]: E0129 03:27:59.244468 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.244683 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:27:59 crc kubenswrapper[4707]: E0129 03:27:59.245145 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.340027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.340078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.340099 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.340123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.340139 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.443671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.443740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.443753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.443783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.443798 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.508771 4707 generic.go:334] "Generic (PLEG): container finished" podID="bedc9546-6a4d-44ec-b95f-84c3329307cf" containerID="fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86" exitCode=0 Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.508883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" event={"ID":"bedc9546-6a4d-44ec-b95f-84c3329307cf","Type":"ContainerDied","Data":"fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.527530 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.543615 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.547028 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.547087 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.547102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.547127 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.547140 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.558280 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.564304 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 12:24:50.332063944 +0000 UTC Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.573172 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.593007 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.607284 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.622061 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.635213 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.647920 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.649956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.649982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.649992 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.650035 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.650054 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.661229 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.685264 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.700852 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.717168 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.731194 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.747906 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:27:59Z is after 2025-08-24T17:21:41Z" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.752788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.752841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.752854 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.752874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.752890 4707 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.855329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.855365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.855375 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.855393 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.855406 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.958527 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.958727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.958740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.958768 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:27:59 crc kubenswrapper[4707]: I0129 03:27:59.958782 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:27:59Z","lastTransitionTime":"2026-01-29T03:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.061900 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.061947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.061960 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.061990 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.062007 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.165195 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.165339 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.165359 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.165417 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.165437 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.268276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.268404 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.268416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.268439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.268453 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.370916 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.370962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.370977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.370998 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.371009 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.474321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.474403 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.474424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.474452 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.474471 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.517923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.518414 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.518605 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.518665 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.524133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" event={"ID":"bedc9546-6a4d-44ec-b95f-84c3329307cf","Type":"ContainerStarted","Data":"5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.540120 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.552211 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.554815 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.555972 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.565177 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 16:30:27.627826364 +0000 UTC Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.578056 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.578112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.578134 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.578162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.578185 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.598214 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.626451 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f
65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.649260 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.669207 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.681159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.681208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.681221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.681253 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.681270 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.686376 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.699614 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.711941 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.728108 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.748695 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.765751 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.781288 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.783925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.783970 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.783983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.784005 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.784018 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.798315 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.811344 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.826956 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.847047 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.865532 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc
0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.883755 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.886734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.886765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.886775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.886792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.886806 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.898834 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.913825 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.929497 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.944017 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.954094 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.973433 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.987010 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.989330 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.989391 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.989408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 
03:28:00.989434 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:00 crc kubenswrapper[4707]: I0129 03:28:00.989449 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:00Z","lastTransitionTime":"2026-01-29T03:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.001009 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:00Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.012311 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:01Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.025001 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:01Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.037272 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:01Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.091997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.092056 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.092068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.092089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.092102 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:01Z","lastTransitionTime":"2026-01-29T03:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.194689 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.194774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.194800 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.194836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.194864 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:01Z","lastTransitionTime":"2026-01-29T03:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.243005 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.243031 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:01 crc kubenswrapper[4707]: E0129 03:28:01.243276 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:01 crc kubenswrapper[4707]: E0129 03:28:01.243356 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.243065 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:01 crc kubenswrapper[4707]: E0129 03:28:01.243523 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.297748 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.297841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.297873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.297908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.297938 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:01Z","lastTransitionTime":"2026-01-29T03:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.400442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.400504 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.400515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.400550 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.400564 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:01Z","lastTransitionTime":"2026-01-29T03:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.503400 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.503488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.503513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.503576 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.503598 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:01Z","lastTransitionTime":"2026-01-29T03:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.565410 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 00:25:12.280264665 +0000 UTC Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.606964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.607045 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.607070 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.607109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.607137 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:01Z","lastTransitionTime":"2026-01-29T03:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.710769 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.710830 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.710850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.710877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.710897 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:01Z","lastTransitionTime":"2026-01-29T03:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.814735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.814799 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.814818 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.814842 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.814860 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:01Z","lastTransitionTime":"2026-01-29T03:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.918096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.918235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.918266 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.918299 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:01 crc kubenswrapper[4707]: I0129 03:28:01.918322 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:01Z","lastTransitionTime":"2026-01-29T03:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.020819 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.020872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.020886 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.020904 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.020916 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.124034 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.124090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.124106 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.124131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.124146 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.227613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.227690 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.227700 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.227719 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.227732 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.331187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.331277 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.331301 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.331338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.331365 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.396685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.396796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.396824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.396850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.396872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397018 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397037 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397048 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397096 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:18.397081538 +0000 UTC m=+51.881310443 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397482 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:28:18.39747175 +0000 UTC m=+51.881700655 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397532 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397573 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:18.397567173 +0000 UTC m=+51.881796078 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397609 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397629 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:18.397623515 +0000 UTC m=+51.881852420 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397671 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397680 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397688 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:02 crc kubenswrapper[4707]: E0129 03:28:02.397729 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:18.397723608 +0000 UTC m=+51.881952513 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.435393 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.435477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.435502 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.435594 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.435627 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.538269 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.538318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.538333 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.538358 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.538373 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.566600 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 10:01:30.068819936 +0000 UTC Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.641755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.641813 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.641837 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.641865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.641881 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.745583 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.745656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.745686 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.745723 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.745751 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.848692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.849445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.849634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.849727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.849814 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.952560 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.952861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.952924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.953010 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:02 crc kubenswrapper[4707]: I0129 03:28:02.953068 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:02Z","lastTransitionTime":"2026-01-29T03:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.057052 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.057099 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.057115 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.057137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.057150 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.137907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.137964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.137983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.138006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.138023 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: E0129 03:28:03.158444 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.162836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.162868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.162880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.162897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.162909 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: E0129 03:28:03.185041 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.196245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.196326 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.196352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.196382 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.196405 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: E0129 03:28:03.214816 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.219793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.219847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.219866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.219891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.219910 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.239515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.239620 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.239648 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.239683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.239708 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.242912 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.242932 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:03 crc kubenswrapper[4707]: E0129 03:28:03.243027 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.243433 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:03 crc kubenswrapper[4707]: E0129 03:28:03.243507 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:03 crc kubenswrapper[4707]: E0129 03:28:03.243601 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:03 crc kubenswrapper[4707]: E0129 03:28:03.258510 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: E0129 03:28:03.258773 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.261383 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.261467 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.261492 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.261519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.261599 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.364534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.364622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.364641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.364668 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.364688 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.468641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.468707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.468728 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.468754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.468771 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.537400 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/0.log" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.543018 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b" exitCode=1 Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.543107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.544601 4707 scope.go:117] "RemoveContainer" containerID="b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.567236 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 17:47:02.311067458 +0000 UTC Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.568055 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.573488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.573568 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.573591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.573619 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.573639 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.590732 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.610175 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.634896 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.656205 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.678024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.678996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.679287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.679477 
4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.680057 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.679513 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.701149 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.736203 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03
:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.760414 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.777445 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.785123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.785171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.785190 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc 
kubenswrapper[4707]: I0129 03:28:03.785216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.785235 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.792870 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.803847 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.818102 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.834063 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.862115 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:02Z\\\",\\\"message\\\":\\\":02.650989 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 03:28:02.651440 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 03:28:02.651467 6031 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0129 03:28:02.651003 6031 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:02.651871 6031 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.651875 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:02.651966 6031 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652168 6031 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652370 6031 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652784 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:02.652806 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:02.652848 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:02.652868 6031 factory.go:656] Stopping watch factory\\\\nI0129 03:28:02.652897 6031 ovnkube.go:599] Stopped 
ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53
a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:03Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.892932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.892987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.893004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.893028 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.893047 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.996646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.996708 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.996730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.996760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:03 crc kubenswrapper[4707]: I0129 03:28:03.996786 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:03Z","lastTransitionTime":"2026-01-29T03:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.100707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.100788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.100814 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.100844 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.100867 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:04Z","lastTransitionTime":"2026-01-29T03:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.203733 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.203787 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.203803 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.203827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.203841 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:04Z","lastTransitionTime":"2026-01-29T03:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.307175 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.307231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.307244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.307262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.307273 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:04Z","lastTransitionTime":"2026-01-29T03:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.410283 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.410366 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.410394 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.410430 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.410456 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:04Z","lastTransitionTime":"2026-01-29T03:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.513080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.513130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.513144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.513164 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.513176 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:04Z","lastTransitionTime":"2026-01-29T03:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.549894 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/0.log" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.554757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.555676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.568278 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 19:54:57.84901461 +0000 UTC Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.574205 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.608724 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:02Z\\\",\\\"message\\\":\\\":02.650989 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 03:28:02.651440 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 03:28:02.651467 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:02.651003 6031 handler.go:208] Removed *v1.EgressFirewall event 
handler 9\\\\nI0129 03:28:02.651871 6031 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.651875 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:02.651966 6031 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652168 6031 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652370 6031 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652784 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:02.652806 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:02.652848 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:02.652868 6031 factory.go:656] Stopping watch factory\\\\nI0129 03:28:02.652897 6031 ovnkube.go:599] Stopped 
ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.615784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.615984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.616119 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.616258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.616392 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:04Z","lastTransitionTime":"2026-01-29T03:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.630105 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.660922 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.691467 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.708910 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.719294 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.719328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.719337 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.719351 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.719361 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:04Z","lastTransitionTime":"2026-01-29T03:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.726083 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d30
8af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.741965 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.755260 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.769022 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.791176 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.806162 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.822615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.822666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.822683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.822704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.822722 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:04Z","lastTransitionTime":"2026-01-29T03:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.826944 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.841232 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.854137 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:04Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.925935 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.925990 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.926008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.926037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:04 crc kubenswrapper[4707]: I0129 03:28:04.926054 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:04Z","lastTransitionTime":"2026-01-29T03:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.029130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.029188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.029200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.029219 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.029235 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.132896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.132948 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.132970 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.132996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.133015 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.236033 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.236092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.236105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.236130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.236149 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.243468 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.243577 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.243469 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:05 crc kubenswrapper[4707]: E0129 03:28:05.243660 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:05 crc kubenswrapper[4707]: E0129 03:28:05.243787 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:05 crc kubenswrapper[4707]: E0129 03:28:05.243906 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.339236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.339639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.339735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.339878 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.339993 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.443053 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.443117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.443130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.443151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.443165 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.546153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.546236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.546255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.546280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.546300 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.564844 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/1.log" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.565003 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m"] Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.566679 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.568145 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/0.log" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.569383 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 06:11:03.546536497 +0000 UTC Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.574368 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.575853 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.578433 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef" exitCode=1 Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.578494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.578604 4707 scope.go:117] "RemoveContainer" containerID="b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.579797 4707 scope.go:117] "RemoveContainer" containerID="25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef" Jan 29 03:28:05 crc kubenswrapper[4707]: E0129 03:28:05.580050 4707 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.594361 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d7
92159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.617693 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.636469 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.649829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.649897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.649918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.649949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.649969 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.663075 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.684145 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.699285 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.716350 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.734594 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.740279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqhp\" 
(UniqueName: \"kubernetes.io/projected/91cba8d8-e784-454a-8397-936cb3a94b79-kube-api-access-2dqhp\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.740400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91cba8d8-e784-454a-8397-936cb3a94b79-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.740457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91cba8d8-e784-454a-8397-936cb3a94b79-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.740529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91cba8d8-e784-454a-8397-936cb3a94b79-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.753220 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.753287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc 
kubenswrapper[4707]: I0129 03:28:05.753306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.753333 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.753353 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.762080 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:2
9Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.780283 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.798614 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.816177 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.832367 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.842210 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqhp\" (UniqueName: \"kubernetes.io/projected/91cba8d8-e784-454a-8397-936cb3a94b79-kube-api-access-2dqhp\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.842288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91cba8d8-e784-454a-8397-936cb3a94b79-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.842323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91cba8d8-e784-454a-8397-936cb3a94b79-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.842385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/91cba8d8-e784-454a-8397-936cb3a94b79-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.843285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91cba8d8-e784-454a-8397-936cb3a94b79-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.843631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91cba8d8-e784-454a-8397-936cb3a94b79-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.848969 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.850613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91cba8d8-e784-454a-8397-936cb3a94b79-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.856073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.856116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.856130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.856153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.856168 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.863698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqhp\" (UniqueName: \"kubernetes.io/projected/91cba8d8-e784-454a-8397-936cb3a94b79-kube-api-access-2dqhp\") pod \"ovnkube-control-plane-749d76644c-s2v9m\" (UID: \"91cba8d8-e784-454a-8397-936cb3a94b79\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.870445 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:02Z\\\",\\\"message\\\":\\\":02.650989 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 03:28:02.651440 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 03:28:02.651467 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:02.651003 6031 handler.go:208] Removed *v1.EgressFirewall event 
handler 9\\\\nI0129 03:28:02.651871 6031 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.651875 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:02.651966 6031 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652168 6031 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652370 6031 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652784 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:02.652806 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:02.652848 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:02.652868 6031 factory.go:656] Stopping watch factory\\\\nI0129 03:28:02.652897 6031 ovnkube.go:599] Stopped 
ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.890091 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.902701 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.911907 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f
13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.
126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: W0129 03:28:05.918815 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91cba8d8_e784_454a_8397_936cb3a94b79.slice/crio-2ed0c96af354a2f21253a0c0e44e24f7ba2d195e82fc3b873eda03cbb7efe570 WatchSource:0}: Error finding container 2ed0c96af354a2f21253a0c0e44e24f7ba2d195e82fc3b873eda03cbb7efe570: Status 404 returned error can't find the container with id 2ed0c96af354a2f21253a0c0e44e24f7ba2d195e82fc3b873eda03cbb7efe570 Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.945441 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.963988 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.964029 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.964042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.964058 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:05 crc kubenswrapper[4707]: I0129 03:28:05.964071 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:05Z","lastTransitionTime":"2026-01-29T03:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:05.999920 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:05Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.023216 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.038524 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.050879 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.065474 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.066711 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.066755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.066768 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.066793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.066810 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.085575 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:02Z\\\",\\\"message\\\":\\\":02.650989 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 03:28:02.651440 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 03:28:02.651467 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:02.651003 6031 handler.go:208] Removed *v1.EgressFirewall event 
handler 9\\\\nI0129 03:28:02.651871 6031 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.651875 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:02.651966 6031 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652168 6031 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652370 6031 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652784 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:02.652806 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:02.652848 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:02.652868 6031 factory.go:656] Stopping watch factory\\\\nI0129 03:28:02.652897 6031 ovnkube.go:599] Stopped ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:04Z\\\",\\\"message\\\":\\\"le:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 03:28:04.726562 6163 obj_retry.go:551] Creating *factory.egressNode crc took: 11.457004ms\\\\nI0129 03:28:04.726598 6163 factory.go:1336] Added *v1.Node event handler 
7\\\\nI0129 03:28:04.726642 6163 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 03:28:04.726665 6163 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:04.726690 6163 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:04.726753 6163 factory.go:656] Stopping watch factory\\\\nI0129 03:28:04.726808 6163 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:04.726855 6163 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:04.726899 6163 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 03:28:04.726979 6163 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 03:28:04.727010 6163 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:04.727044 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 03:28:04.727176 6163 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/et
c/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.099713 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.116714 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.134092 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.153202 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.169917 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.170009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.170024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.170044 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.170056 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.173231 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.190144 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.203212 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.218025 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.272191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.272244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.272256 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.272281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.272294 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.376375 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.376443 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.376453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.376472 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.376486 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.478863 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.478924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.478941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.478959 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.478971 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.569993 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 02:47:38.465019599 +0000 UTC Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.581227 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.581276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.581285 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.581303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.581315 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.585365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" event={"ID":"91cba8d8-e784-454a-8397-936cb3a94b79","Type":"ContainerStarted","Data":"927806fb77861764fef7e35e6d610484cd671fdc4b7edbb0275820ec3cb1cc5d"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.585412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" event={"ID":"91cba8d8-e784-454a-8397-936cb3a94b79","Type":"ContainerStarted","Data":"5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.585428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" event={"ID":"91cba8d8-e784-454a-8397-936cb3a94b79","Type":"ContainerStarted","Data":"2ed0c96af354a2f21253a0c0e44e24f7ba2d195e82fc3b873eda03cbb7efe570"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.589856 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/1.log" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.600120 4707 scope.go:117] "RemoveContainer" containerID="25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef" Jan 29 03:28:06 crc kubenswrapper[4707]: E0129 03:28:06.600459 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.611952 4707 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.632743 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.644900 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.661131 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.675041 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.684681 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.684746 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.684760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.684781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.684794 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.693862 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z 
is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.720233 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01
-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.735629 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.749414 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.761101 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.775836 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.787365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.787408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.787423 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.787446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.787460 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.800943 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8b048cb35b7e5cefd10c1eb01fc489a72bc414f3e1284a62668892a9bd6b96b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:02Z\\\",\\\"message\\\":\\\":02.650989 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 03:28:02.651440 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 03:28:02.651467 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:02.651003 6031 handler.go:208] Removed *v1.EgressFirewall event 
handler 9\\\\nI0129 03:28:02.651871 6031 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.651875 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:02.651966 6031 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652168 6031 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652370 6031 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 03:28:02.652784 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:02.652806 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:02.652848 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:02.652868 6031 factory.go:656] Stopping watch factory\\\\nI0129 03:28:02.652897 6031 ovnkube.go:599] Stopped ovnkube\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:04Z\\\",\\\"message\\\":\\\"le:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 03:28:04.726562 6163 obj_retry.go:551] Creating *factory.egressNode crc took: 11.457004ms\\\\nI0129 03:28:04.726598 6163 factory.go:1336] Added *v1.Node event handler 
7\\\\nI0129 03:28:04.726642 6163 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 03:28:04.726665 6163 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:04.726690 6163 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:04.726753 6163 factory.go:656] Stopping watch factory\\\\nI0129 03:28:04.726808 6163 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:04.726855 6163 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:04.726899 6163 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 03:28:04.726979 6163 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 03:28:04.727010 6163 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:04.727044 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 03:28:04.727176 6163 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/et
c/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.818162 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.835985 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.855029 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.869808 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.891113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.891162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.891176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.891212 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.891228 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.892225 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.905945 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.924209 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.939268 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.954236 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.973373 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.990756 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:06Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.994788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.994857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.994870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.994892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:06 crc kubenswrapper[4707]: I0129 03:28:06.994906 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:06Z","lastTransitionTime":"2026-01-29T03:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.018008 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:04Z\\\",\\\"message\\\":\\\"le:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 03:28:04.726562 6163 obj_retry.go:551] Creating *factory.egressNode crc took: 11.457004ms\\\\nI0129 03:28:04.726598 6163 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 03:28:04.726642 6163 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 03:28:04.726665 6163 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:04.726690 6163 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:04.726753 6163 factory.go:656] Stopping watch factory\\\\nI0129 03:28:04.726808 6163 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:04.726855 6163 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:04.726899 6163 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 03:28:04.726979 6163 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 03:28:04.727010 6163 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:04.727044 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 03:28:04.727176 6163 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.034074 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.050877 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.066455 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.081716 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.098330 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.098401 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.098420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.098448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.098468 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:07Z","lastTransitionTime":"2026-01-29T03:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.103361 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.108648 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-652c6"] Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.109342 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:07 crc kubenswrapper[4707]: E0129 03:28:07.109429 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.117678 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.135276 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.147746 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.161502 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.192366 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:04Z\\\",\\\"message\\\":\\\"le:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 03:28:04.726562 6163 obj_retry.go:551] Creating *factory.egressNode crc took: 11.457004ms\\\\nI0129 03:28:04.726598 6163 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 03:28:04.726642 6163 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 03:28:04.726665 6163 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:04.726690 6163 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:04.726753 6163 factory.go:656] Stopping watch factory\\\\nI0129 03:28:04.726808 6163 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:04.726855 6163 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:04.726899 6163 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 03:28:04.726979 6163 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 03:28:04.727010 6163 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:04.727044 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 03:28:04.727176 6163 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.202052 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.202109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.202151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.202177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.202195 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:07Z","lastTransitionTime":"2026-01-29T03:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.208937 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.225715 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.243102 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.243102 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.243174 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:07 crc kubenswrapper[4707]: E0129 03:28:07.243640 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:07 crc kubenswrapper[4707]: E0129 03:28:07.243721 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:07 crc kubenswrapper[4707]: E0129 03:28:07.243818 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.254323 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.259026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.259139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmnt5\" (UniqueName: \"kubernetes.io/projected/08dd724c-b8cc-45c6-9a61-13643a1c0d75-kube-api-access-zmnt5\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.273470 4707 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.295259 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.308398 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.308463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.308482 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.308508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.308527 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:07Z","lastTransitionTime":"2026-01-29T03:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.322326 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d30
8af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.337457 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.349351 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.359939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.360034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmnt5\" (UniqueName: \"kubernetes.io/projected/08dd724c-b8cc-45c6-9a61-13643a1c0d75-kube-api-access-zmnt5\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:07 crc kubenswrapper[4707]: E0129 03:28:07.360127 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:07 crc kubenswrapper[4707]: E0129 03:28:07.360249 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs podName:08dd724c-b8cc-45c6-9a61-13643a1c0d75 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:07.860217711 +0000 UTC m=+41.344446656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs") pod "network-metrics-daemon-652c6" (UID: "08dd724c-b8cc-45c6-9a61-13643a1c0d75") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.373956 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.389669 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.390381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmnt5\" (UniqueName: \"kubernetes.io/projected/08dd724c-b8cc-45c6-9a61-13643a1c0d75-kube-api-access-zmnt5\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.410439 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.414644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.414730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.414744 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.414766 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.414779 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:07Z","lastTransitionTime":"2026-01-29T03:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.427218 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.443432 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.464081 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.478999 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc 
kubenswrapper[4707]: I0129 03:28:07.501304 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.518067 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.518123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.518136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.518155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.518169 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:07Z","lastTransitionTime":"2026-01-29T03:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.523967 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:04Z\\\",\\\"message\\\":\\\"le:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 03:28:04.726562 6163 obj_retry.go:551] Creating *factory.egressNode crc took: 11.457004ms\\\\nI0129 03:28:04.726598 6163 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 03:28:04.726642 6163 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 03:28:04.726665 6163 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:04.726690 6163 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:04.726753 6163 factory.go:656] Stopping watch factory\\\\nI0129 03:28:04.726808 6163 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:04.726855 6163 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:04.726899 6163 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 03:28:04.726979 6163 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 03:28:04.727010 6163 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:04.727044 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 03:28:04.727176 6163 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.539215 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.563418 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.570468 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 15:10:20.966009014 +0000 UTC Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.583814 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.606256 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.620950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 
03:28:07.621023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.621043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.621069 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.621087 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:07Z","lastTransitionTime":"2026-01-29T03:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.635037 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.656083 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.675206 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.691162 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.708842 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.724042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.724125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.724140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.724159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.724174 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:07Z","lastTransitionTime":"2026-01-29T03:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.742681 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.765429 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.791166 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.805108 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.822931 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.827672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.827717 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.827728 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.827751 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.827766 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:07Z","lastTransitionTime":"2026-01-29T03:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.844236 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:07Z 
is after 2025-08-24T17:21:41Z" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.868025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:07 crc kubenswrapper[4707]: E0129 03:28:07.868205 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:07 crc kubenswrapper[4707]: E0129 03:28:07.868306 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs podName:08dd724c-b8cc-45c6-9a61-13643a1c0d75 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:08.868279507 +0000 UTC m=+42.352508412 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs") pod "network-metrics-daemon-652c6" (UID: "08dd724c-b8cc-45c6-9a61-13643a1c0d75") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.931411 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.931518 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.931584 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.931620 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:07 crc kubenswrapper[4707]: I0129 03:28:07.931672 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:07Z","lastTransitionTime":"2026-01-29T03:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.035988 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.036074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.036116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.036156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.036182 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.140557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.140617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.140626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.140646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.140658 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.242760 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:08 crc kubenswrapper[4707]: E0129 03:28:08.242976 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.244866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.244925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.244950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.244979 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.245001 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.348123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.348179 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.348189 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.348208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.348221 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.451621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.451679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.451694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.451716 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.451731 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.555714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.555790 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.555809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.555836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.555853 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.571108 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 13:28:56.421177719 +0000 UTC Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.659384 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.659440 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.659457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.659479 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.659497 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.762977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.763052 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.763072 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.763097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.763117 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.866476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.866646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.866672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.866705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.866729 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.882434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:08 crc kubenswrapper[4707]: E0129 03:28:08.882647 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:08 crc kubenswrapper[4707]: E0129 03:28:08.882727 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs podName:08dd724c-b8cc-45c6-9a61-13643a1c0d75 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:10.882702678 +0000 UTC m=+44.366931623 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs") pod "network-metrics-daemon-652c6" (UID: "08dd724c-b8cc-45c6-9a61-13643a1c0d75") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.970978 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.971043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.971066 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.971091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:08 crc kubenswrapper[4707]: I0129 03:28:08.971109 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:08Z","lastTransitionTime":"2026-01-29T03:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.074451 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.074519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.074579 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.074605 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.074629 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:09Z","lastTransitionTime":"2026-01-29T03:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.178163 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.178214 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.178237 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.178263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.178281 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:09Z","lastTransitionTime":"2026-01-29T03:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.243748 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.243787 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.243778 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:09 crc kubenswrapper[4707]: E0129 03:28:09.243962 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:09 crc kubenswrapper[4707]: E0129 03:28:09.244093 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:09 crc kubenswrapper[4707]: E0129 03:28:09.244491 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.281608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.281651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.281666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.281684 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.281696 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:09Z","lastTransitionTime":"2026-01-29T03:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.384709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.384764 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.384783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.384808 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.384826 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:09Z","lastTransitionTime":"2026-01-29T03:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.488246 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.488314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.488332 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.488359 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.488378 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:09Z","lastTransitionTime":"2026-01-29T03:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.572068 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 12:12:13.564599133 +0000 UTC Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.592357 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.592415 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.592427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.592447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.592462 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:09Z","lastTransitionTime":"2026-01-29T03:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.701747 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.701824 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.701844 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.701874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.701893 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:09Z","lastTransitionTime":"2026-01-29T03:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.805196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.805266 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.805285 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.805312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.805357 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:09Z","lastTransitionTime":"2026-01-29T03:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.908074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.908736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.908774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.908810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:09 crc kubenswrapper[4707]: I0129 03:28:09.908835 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:09Z","lastTransitionTime":"2026-01-29T03:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.012627 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.012753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.012783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.012852 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.012871 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.116100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.116358 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.116463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.116572 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.116660 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.218770 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.218833 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.218847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.218863 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.218872 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.243441 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:10 crc kubenswrapper[4707]: E0129 03:28:10.243675 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.322056 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.322109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.322127 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.322153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.322172 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.424783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.424842 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.424857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.424874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.424885 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.527874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.527909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.527925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.527945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.527958 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.572908 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 22:19:51.429951251 +0000 UTC Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.630576 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.630621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.630640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.630663 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.630681 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.732984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.733024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.733036 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.733054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.733066 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.836437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.836472 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.836485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.836520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.836561 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.903232 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:10 crc kubenswrapper[4707]: E0129 03:28:10.903443 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:10 crc kubenswrapper[4707]: E0129 03:28:10.903573 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs podName:08dd724c-b8cc-45c6-9a61-13643a1c0d75 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:14.903517319 +0000 UTC m=+48.387746224 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs") pod "network-metrics-daemon-652c6" (UID: "08dd724c-b8cc-45c6-9a61-13643a1c0d75") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.939940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.940011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.940029 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.940052 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:10 crc kubenswrapper[4707]: I0129 03:28:10.940070 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:10Z","lastTransitionTime":"2026-01-29T03:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.042972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.043036 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.043054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.043079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.043096 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.146243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.146299 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.146323 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.146346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.146364 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.243509 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.243594 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.243613 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:11 crc kubenswrapper[4707]: E0129 03:28:11.243777 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:11 crc kubenswrapper[4707]: E0129 03:28:11.243916 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:11 crc kubenswrapper[4707]: E0129 03:28:11.244035 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.251581 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.251649 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.251671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.251700 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.251718 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.354982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.355034 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.355089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.355110 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.355127 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.458216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.458303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.458322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.458348 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.458368 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.562890 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.562945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.562958 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.562985 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.562998 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.573628 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 02:41:13.283489355 +0000 UTC Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.666105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.666149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.666157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.666210 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.666222 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.769835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.769919 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.769944 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.769977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.769997 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.872964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.873057 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.873077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.873116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.873134 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.976996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.977093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.977121 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.977148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:11 crc kubenswrapper[4707]: I0129 03:28:11.977165 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:11Z","lastTransitionTime":"2026-01-29T03:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.080966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.081040 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.081059 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.081092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.081114 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:12Z","lastTransitionTime":"2026-01-29T03:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.188977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.189077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.189105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.189139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.189161 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:12Z","lastTransitionTime":"2026-01-29T03:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.242690 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:12 crc kubenswrapper[4707]: E0129 03:28:12.242847 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.293119 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.293187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.293227 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.293263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.293284 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:12Z","lastTransitionTime":"2026-01-29T03:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.395903 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.395960 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.395972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.395986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.395995 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:12Z","lastTransitionTime":"2026-01-29T03:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.499213 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.499265 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.499277 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.499297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.499309 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:12Z","lastTransitionTime":"2026-01-29T03:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.574569 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 04:43:23.25778815 +0000 UTC Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.601909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.601968 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.601981 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.602001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.602014 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:12Z","lastTransitionTime":"2026-01-29T03:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.705213 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.705262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.705272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.705297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.705310 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:12Z","lastTransitionTime":"2026-01-29T03:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.808485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.808562 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.808573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.808592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.808604 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:12Z","lastTransitionTime":"2026-01-29T03:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.910871 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.910922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.910936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.910957 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:12 crc kubenswrapper[4707]: I0129 03:28:12.910969 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:12Z","lastTransitionTime":"2026-01-29T03:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.014250 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.014317 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.014328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.014346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.014359 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.117626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.117680 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.117693 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.117714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.117729 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.220897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.220966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.220980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.221007 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.221023 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.243639 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.243795 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:13 crc kubenswrapper[4707]: E0129 03:28:13.243880 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.243660 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:13 crc kubenswrapper[4707]: E0129 03:28:13.244058 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:13 crc kubenswrapper[4707]: E0129 03:28:13.244228 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.323670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.323721 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.323732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.323751 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.323764 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.330907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.330956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.330967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.330986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.331000 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: E0129 03:28:13.348025 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:13Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.352915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.353007 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.353026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.353048 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.353065 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: E0129 03:28:13.369904 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:13Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.374241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.374297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.374308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.374329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.374343 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: E0129 03:28:13.392393 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:13Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.396816 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.396870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.396920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.396944 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.396969 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: E0129 03:28:13.416307 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:13Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.420825 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.420877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.420886 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.420904 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.420913 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:13 crc kubenswrapper[4707]: E0129 03:28:13.435931 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:13Z is after 2025-08-24T17:21:41Z"
Jan 29 03:28:13 crc kubenswrapper[4707]: E0129 03:28:13.436054 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.438104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.438144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.438156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.438176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.438189 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.541242 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.541286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.541296 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.541315 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.541330 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.575394 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 22:11:39.328961569 +0000 UTC
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.644228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.644282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.644293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.644314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.644328 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.747761 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.747801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.747812 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.747829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.747844 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.851500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.851809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.851926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.852090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.852245 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.955750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.955805 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.955815 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.955831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:13 crc kubenswrapper[4707]: I0129 03:28:13.955841 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:13Z","lastTransitionTime":"2026-01-29T03:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.058270 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.059111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.059208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.059305 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.059387 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:14Z","lastTransitionTime":"2026-01-29T03:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.163022 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.163106 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.163121 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.163142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.163154 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:14Z","lastTransitionTime":"2026-01-29T03:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.242716 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:28:14 crc kubenswrapper[4707]: E0129 03:28:14.242967 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.266140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.266208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.266241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.266262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.266276 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:14Z","lastTransitionTime":"2026-01-29T03:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.369566 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.369618 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.369630 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.369651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.369665 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:14Z","lastTransitionTime":"2026-01-29T03:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.473180 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.473248 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.473261 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.473285 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.473300 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:14Z","lastTransitionTime":"2026-01-29T03:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.688764 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 12:43:18.201616868 +0000 UTC
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.691820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.691886 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.691906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.691933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.691949 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:14Z","lastTransitionTime":"2026-01-29T03:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.794450 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.794500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.794516 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.794552 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.794566 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:14Z","lastTransitionTime":"2026-01-29T03:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.901497 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.901577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.901589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.901614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.901627 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:14Z","lastTransitionTime":"2026-01-29T03:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:14 crc kubenswrapper[4707]: I0129 03:28:14.991518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:28:14 crc kubenswrapper[4707]: E0129 03:28:14.991759 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 29 03:28:14 crc kubenswrapper[4707]: E0129 03:28:14.991843 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs podName:08dd724c-b8cc-45c6-9a61-13643a1c0d75 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:22.991823015 +0000 UTC m=+56.476051920 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs") pod "network-metrics-daemon-652c6" (UID: "08dd724c-b8cc-45c6-9a61-13643a1c0d75") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.004445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.004492 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.004501 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.004515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.004524 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.107908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.107953 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.107966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.107983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.107994 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.212467 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.212683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.212710 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.212743 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.212764 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.243019 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.243028 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:28:15 crc kubenswrapper[4707]: E0129 03:28:15.243321 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.243104 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:28:15 crc kubenswrapper[4707]: E0129 03:28:15.243531 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:28:15 crc kubenswrapper[4707]: E0129 03:28:15.243776 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.315862 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.315918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.315936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.315964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.315984 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.419736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.419778 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.419790 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.419809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.419821 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.524362 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.524435 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.524459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.524490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.524517 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.627581 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.627644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.627692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.627729 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.627742 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.689851 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 17:47:31.373421814 +0000 UTC Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.730808 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.730887 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.730910 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.730940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.730962 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.833578 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.833625 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.833639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.833655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.833667 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.936218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.936264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.936281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.936307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:15 crc kubenswrapper[4707]: I0129 03:28:15.936323 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:15Z","lastTransitionTime":"2026-01-29T03:28:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.039418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.039486 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.039506 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.039532 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.039579 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.142327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.142394 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.142413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.142440 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.142457 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.242658 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:16 crc kubenswrapper[4707]: E0129 03:28:16.242934 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.246057 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.246167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.246194 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.246227 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.246252 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.349314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.349468 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.349490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.349515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.349570 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.452771 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.452819 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.452831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.452851 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.452866 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.555196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.555273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.555293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.555322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.555341 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.658836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.658915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.658935 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.658964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.658982 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.690929 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 10:35:13.534752521 +0000 UTC Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.762866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.762913 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.762927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.762948 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.762962 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.865657 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.865796 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.865817 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.865843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.865860 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.968886 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.968971 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.969002 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.969042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:16 crc kubenswrapper[4707]: I0129 03:28:16.969066 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:16Z","lastTransitionTime":"2026-01-29T03:28:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.071897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.072005 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.072024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.072050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.072074 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:17Z","lastTransitionTime":"2026-01-29T03:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.175831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.175904 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.175922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.175950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.175972 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:17Z","lastTransitionTime":"2026-01-29T03:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.243516 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:17 crc kubenswrapper[4707]: E0129 03:28:17.243700 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.243836 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.244016 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:17 crc kubenswrapper[4707]: E0129 03:28:17.244217 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:17 crc kubenswrapper[4707]: E0129 03:28:17.244428 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.264696 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.279804 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.279872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.279894 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.279944 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.279959 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:17Z","lastTransitionTime":"2026-01-29T03:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.288766 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.308033 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.330449 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.352259 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.368307 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.383475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.383582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.383601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.383621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.383636 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:17Z","lastTransitionTime":"2026-01-29T03:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.385204 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.401830 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd83821
0bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.414822 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.438318 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb
5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.454251 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.469759 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.483026 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.486319 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.486383 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.486401 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:17 crc 
kubenswrapper[4707]: I0129 03:28:17.486444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.486461 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:17Z","lastTransitionTime":"2026-01-29T03:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.494672 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.512105 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.540017 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:04Z\\\",\\\"message\\\":\\\"le:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 03:28:04.726562 6163 obj_retry.go:551] Creating *factory.egressNode crc took: 11.457004ms\\\\nI0129 03:28:04.726598 6163 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 03:28:04.726642 6163 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 03:28:04.726665 6163 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:04.726690 6163 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:04.726753 6163 factory.go:656] Stopping watch factory\\\\nI0129 03:28:04.726808 6163 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:04.726855 6163 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:04.726899 6163 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 03:28:04.726979 6163 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 03:28:04.727010 6163 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:04.727044 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 03:28:04.727176 6163 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.554259 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:17Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.589884 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.589950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.589966 4707 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.589988 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.590003 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:17Z","lastTransitionTime":"2026-01-29T03:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.691115 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:00:38.251197679 +0000 UTC Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.694062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.694139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.694155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.694180 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.694197 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:17Z","lastTransitionTime":"2026-01-29T03:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.798325 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.798405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.798423 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.798454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.798474 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:17Z","lastTransitionTime":"2026-01-29T03:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.902027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.902091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.902144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.902167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:17 crc kubenswrapper[4707]: I0129 03:28:17.902180 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:17Z","lastTransitionTime":"2026-01-29T03:28:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.005512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.005625 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.005650 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.005694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.005720 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.109459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.109515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.109525 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.109567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.109581 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.213002 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.213050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.213063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.213080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.213093 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.243589 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.243779 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.316335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.316398 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.316414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.316433 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.316452 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.419640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.419701 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.419714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.419735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.419746 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.433870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434040 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-29 03:28:50.43401336 +0000 UTC m=+83.918242275 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.434034 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.434091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.434159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.434188 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434195 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434225 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434247 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434318 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:50.434298779 +0000 UTC m=+83.918527724 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434312 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434321 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434409 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434427 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:50.434400982 +0000 UTC m=+83.918629887 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434426 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434558 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:50.434514125 +0000 UTC m=+83.918743260 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434558 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:18 crc kubenswrapper[4707]: E0129 03:28:18.434635 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-29 03:28:50.434609818 +0000 UTC m=+83.918838733 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.522893 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.522972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.522985 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.523006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.523021 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.626265 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.626328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.626343 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.626371 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.626391 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.691736 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 17:29:26.549582295 +0000 UTC Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.729573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.729633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.729646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.729668 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.729684 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.833137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.833218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.833235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.833267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.833289 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.937581 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.937654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.937665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.937683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:18 crc kubenswrapper[4707]: I0129 03:28:18.937696 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:18Z","lastTransitionTime":"2026-01-29T03:28:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.040460 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.040578 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.040602 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.040635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.040658 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.144499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.144627 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.144655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.144691 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.144716 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.243602 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.243690 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.243743 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:19 crc kubenswrapper[4707]: E0129 03:28:19.243860 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:19 crc kubenswrapper[4707]: E0129 03:28:19.244331 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:19 crc kubenswrapper[4707]: E0129 03:28:19.245059 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.245657 4707 scope.go:117] "RemoveContainer" containerID="25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.248961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.249027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.249047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.249075 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.249101 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.352942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.353436 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.353450 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.353469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.353482 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.384333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.399057 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.407453 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.425817 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.440630 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.456872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.456942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.456955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.457003 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.457020 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.459293 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc 
kubenswrapper[4707]: I0129 03:28:19.485668 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.501044 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.524632 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.537386 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.551165 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.559501 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.559586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.559608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.559632 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.559652 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.571103 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z 
is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.588651 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.608315 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:04Z\\\",\\\"message\\\":\\\"le:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 03:28:04.726562 6163 obj_retry.go:551] Creating *factory.egressNode crc took: 11.457004ms\\\\nI0129 03:28:04.726598 6163 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 03:28:04.726642 6163 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 03:28:04.726665 6163 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:04.726690 6163 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:04.726753 6163 factory.go:656] Stopping watch factory\\\\nI0129 03:28:04.726808 6163 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:04.726855 6163 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:04.726899 6163 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 03:28:04.726979 6163 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 03:28:04.727010 6163 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:04.727044 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 03:28:04.727176 6163 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.639842 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.662134 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.662198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.662209 4707 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.662230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.662245 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.672021 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 
cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.691572 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.691847 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 13:14:43.651535815 +0000 UTC Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.712833 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/1.log" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.715158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.715789 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.718194 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.741375 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.756818 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.764339 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.764401 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.764413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.764437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.764453 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.781677 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.798234 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.813169 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.830389 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.843263 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc 
kubenswrapper[4707]: I0129 03:28:19.862895 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.866420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.866465 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.866480 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.866499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.866511 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.877282 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.897157 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:04Z\\\",\\\"message\\\":\\\"le:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 03:28:04.726562 6163 obj_retry.go:551] Creating *factory.egressNode crc took: 11.457004ms\\\\nI0129 03:28:04.726598 6163 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 03:28:04.726642 6163 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 03:28:04.726665 6163 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:04.726690 6163 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:04.726753 6163 factory.go:656] Stopping watch factory\\\\nI0129 03:28:04.726808 6163 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:04.726855 6163 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:04.726899 6163 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 03:28:04.726979 6163 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 03:28:04.727010 6163 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:04.727044 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 03:28:04.727176 6163 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.910143 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.924491 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.941847 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.960499 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.969295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.969329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.969340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.969353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.969362 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:19Z","lastTransitionTime":"2026-01-29T03:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.975259 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:19 crc kubenswrapper[4707]: I0129 03:28:19.998918 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:19Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.015382 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.025693 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.042190 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.071769 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.071843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.071863 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.071892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.071910 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:20Z","lastTransitionTime":"2026-01-29T03:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.174782 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.175163 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.175289 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.175418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.175583 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:20Z","lastTransitionTime":"2026-01-29T03:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.243043 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:20 crc kubenswrapper[4707]: E0129 03:28:20.243205 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.277920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.278703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.278814 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.278884 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.278948 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:20Z","lastTransitionTime":"2026-01-29T03:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.382395 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.382705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.382812 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.382918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.383019 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:20Z","lastTransitionTime":"2026-01-29T03:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.488188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.488264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.488286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.488319 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.488342 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:20Z","lastTransitionTime":"2026-01-29T03:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.591470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.591514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.591525 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.591577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.591591 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:20Z","lastTransitionTime":"2026-01-29T03:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.692283 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 20:39:18.058132169 +0000 UTC Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.694198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.694298 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.694319 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.694402 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.694422 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:20Z","lastTransitionTime":"2026-01-29T03:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.720671 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/2.log" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.722246 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/1.log" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.726169 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7" exitCode=1 Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.726262 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.726368 4707 scope.go:117] "RemoveContainer" containerID="25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.727712 4707 scope.go:117] "RemoveContainer" containerID="e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7" Jan 29 03:28:20 crc kubenswrapper[4707]: E0129 03:28:20.728046 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.751949 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.771339 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.791772 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.798013 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.798054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.798066 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.798086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.798100 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:20Z","lastTransitionTime":"2026-01-29T03:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.825969 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25c02b58483fccd1abff4537d26af0b91c60f17acfda1d0243f3e82ea1cf0cef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:04Z\\\",\\\"message\\\":\\\"le:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 03:28:04.726562 6163 obj_retry.go:551] Creating *factory.egressNode crc took: 11.457004ms\\\\nI0129 03:28:04.726598 6163 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 03:28:04.726642 6163 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 03:28:04.726665 6163 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:04.726690 6163 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:04.726753 6163 factory.go:656] Stopping watch factory\\\\nI0129 03:28:04.726808 6163 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:04.726855 6163 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:04.726899 6163 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 03:28:04.726979 6163 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 03:28:04.727010 6163 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:04.727044 6163 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 03:28:04.727176 6163 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:20Z\\\",\\\"message\\\":\\\"or removal\\\\nI0129 03:28:20.294458 6383 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0129 03:28:20.294463 6383 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0129 03:28:20.294480 6383 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:20.294484 6383 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0129 03:28:20.294500 6383 handler.go:190] Sending *v1.EgressIP event 
handler 8 for removal\\\\nI0129 03:28:20.294525 6383 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0129 03:28:20.294598 6383 factory.go:656] Stopping watch factory\\\\nI0129 03:28:20.294613 6383 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:20.294646 6383 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:20.294655 6383 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:20.294663 6383 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:20.294669 6383 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0129 03:28:20.294674 6383 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0129 03:28:20.294679 6383 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:20.294684 6383 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:20.294692 6383 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0129 03:28:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{
\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.844962 4707 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.868868 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.889307 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.901446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.901509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.901526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.901577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.901591 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:20Z","lastTransitionTime":"2026-01-29T03:28:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.909707 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.931411 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.951675 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.966942 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.982376 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:20 crc kubenswrapper[4707]: I0129 03:28:20.996395 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:20Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.004116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.004218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.004238 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.004265 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.004284 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.017298 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z 
is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.033003 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc 
kubenswrapper[4707]: I0129 03:28:21.065431 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.081743 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.099365 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.107902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.108105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.108285 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.108443 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.108624 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.212033 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.212081 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.212098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.212125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.212141 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.243235 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:21 crc kubenswrapper[4707]: E0129 03:28:21.243399 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.243694 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.243930 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:21 crc kubenswrapper[4707]: E0129 03:28:21.244126 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:21 crc kubenswrapper[4707]: E0129 03:28:21.244141 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.315093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.315145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.315159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.315182 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.315196 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.419117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.419210 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.419230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.419257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.419277 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.522793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.522851 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.522870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.522894 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.522912 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.626161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.626239 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.626257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.626287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.626309 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.693272 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 19:37:21.278522841 +0000 UTC Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.729753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.729821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.729843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.729872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.729894 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.732970 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/2.log" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.738904 4707 scope.go:117] "RemoveContainer" containerID="e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7" Jan 29 03:28:21 crc kubenswrapper[4707]: E0129 03:28:21.739370 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.757355 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.773194 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.792841 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.808884 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.822884 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.832868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.832953 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.832967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.832994 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.833010 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.836524 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.848208 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.866812 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.881909 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc 
kubenswrapper[4707]: I0129 03:28:21.913358 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.929355 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.937205 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.937317 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.937829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.937920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.938222 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:21Z","lastTransitionTime":"2026-01-29T03:28:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.962947 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:20Z\\\",\\\"message\\\":\\\"or removal\\\\nI0129 03:28:20.294458 6383 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0129 03:28:20.294463 6383 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0129 03:28:20.294480 6383 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:20.294484 6383 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0129 03:28:20.294500 6383 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0129 03:28:20.294525 6383 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0129 03:28:20.294598 6383 factory.go:656] Stopping watch factory\\\\nI0129 03:28:20.294613 6383 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:20.294646 6383 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:20.294655 6383 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:20.294663 6383 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:20.294669 6383 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0129 03:28:20.294674 6383 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0129 03:28:20.294679 6383 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:20.294684 6383 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:20.294692 6383 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0129 03:28:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.979056 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:21 crc kubenswrapper[4707]: I0129 03:28:21.994265 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:21Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.010991 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:22Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.025715 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:22Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.042290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.042355 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.042373 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.042398 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.042414 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.047502 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:22Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.066705 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:22Z is after 2025-08-24T17:21:41Z"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.145468 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.145580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.145604 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.145632 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.145656 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.243215 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:28:22 crc kubenswrapper[4707]: E0129 03:28:22.243410 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.249090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.249183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.249202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.249223 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.249236 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.352494 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.352630 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.352659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.352695 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.352736 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.455975 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.456030 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.456047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.456069 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.456083 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.559172 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.559278 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.559294 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.559315 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.559327 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.662192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.662256 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.662273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.662302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.662320 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.693768 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 00:06:36.173092654 +0000 UTC
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.764967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.765051 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.765078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.765112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.765135 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.868462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.868578 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.868601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.868626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.868644 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.972110 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.972195 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.972217 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.972260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.972279 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:22Z","lastTransitionTime":"2026-01-29T03:28:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 03:28:22 crc kubenswrapper[4707]: I0129 03:28:22.992916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:28:22 crc kubenswrapper[4707]: E0129 03:28:22.993336 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 29 03:28:22 crc kubenswrapper[4707]: E0129 03:28:22.993600 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs podName:08dd724c-b8cc-45c6-9a61-13643a1c0d75 nodeName:}" failed. No retries permitted until 2026-01-29 03:28:38.993509713 +0000 UTC m=+72.477738668 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs") pod "network-metrics-daemon-652c6" (UID: "08dd724c-b8cc-45c6-9a61-13643a1c0d75") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.076226 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.076287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.076305 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.076329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.076346 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.179144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.179440 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.179505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.179621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.179768 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.243019 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:28:23 crc kubenswrapper[4707]: E0129 03:28:23.243262 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.243590 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.243710 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:28:23 crc kubenswrapper[4707]: E0129 03:28:23.243885 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:28:23 crc kubenswrapper[4707]: E0129 03:28:23.244129 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.282232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.282313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.282334 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.282366 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.282385 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.385737 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.385798 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.385810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.385837 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.385852 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.488267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.488330 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.488351 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.488380 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.488407 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.592094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.592147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.592161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.592187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.592205 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.693979 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 11:04:35.700117375 +0000 UTC
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.695380 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.695413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.695422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.695439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.695452 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.798981 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.799017 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.799025 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.799042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.799054 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.803216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.803243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.803252 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.803266 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.803276 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:23 crc kubenswrapper[4707]: E0129 03:28:23.828102 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:23Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.833818 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.833875 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.833892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.833916 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.833934 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:23 crc kubenswrapper[4707]: E0129 03:28:23.852466 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:23Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.857939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.858039 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.858059 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.858086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.858135 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:23 crc kubenswrapper[4707]: E0129 03:28:23.879010 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:23Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.885230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.885284 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.885294 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.885315 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.885329 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:23 crc kubenswrapper[4707]: E0129 03:28:23.905078 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:23Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.910096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.910147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.910157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.910181 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.910194 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:23 crc kubenswrapper[4707]: E0129 03:28:23.923862 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:23Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:23 crc kubenswrapper[4707]: E0129 03:28:23.924017 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.925713 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.925742 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.925753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.925770 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:23 crc kubenswrapper[4707]: I0129 03:28:23.925782 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:23Z","lastTransitionTime":"2026-01-29T03:28:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.028827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.028899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.028927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.028950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.028962 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.133975 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.134070 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.134120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.134145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.134198 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.237938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.238026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.238067 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.238091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.238106 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.243246 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:24 crc kubenswrapper[4707]: E0129 03:28:24.243447 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.340884 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.340946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.340986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.341015 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.341031 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.443622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.443677 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.443691 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.443711 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.443726 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.547375 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.547929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.547948 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.547977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.547998 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.650692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.650761 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.650780 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.650807 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.650826 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.695025 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 13:47:57.324579538 +0000 UTC Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.753498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.753582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.753595 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.753617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.753635 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.857308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.857368 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.857386 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.857411 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.857430 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.960570 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.960632 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.960654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.960676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:24 crc kubenswrapper[4707]: I0129 03:28:24.960689 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:24Z","lastTransitionTime":"2026-01-29T03:28:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.064629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.064685 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.064704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.064731 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.064751 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.168905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.168973 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.168994 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.169024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.169042 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.242847 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.242969 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:25 crc kubenswrapper[4707]: E0129 03:28:25.243014 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:25 crc kubenswrapper[4707]: E0129 03:28:25.243181 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.243313 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:25 crc kubenswrapper[4707]: E0129 03:28:25.243428 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.272254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.272317 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.272335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.272364 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.272383 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.375899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.375967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.375985 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.376011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.376028 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.479491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.479557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.479567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.479586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.479598 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.582269 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.582323 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.582339 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.582366 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.582383 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.685828 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.685913 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.685931 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.685962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.685982 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.696080 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 18:09:34.041269063 +0000 UTC Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.788498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.788577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.788592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.788623 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.788642 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.892014 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.892079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.892103 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.892132 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.892153 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.995426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.995472 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.995482 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.995500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:25 crc kubenswrapper[4707]: I0129 03:28:25.995514 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:25Z","lastTransitionTime":"2026-01-29T03:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.097968 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.098032 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.098054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.098083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.098102 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:26Z","lastTransitionTime":"2026-01-29T03:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.201452 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.201582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.201613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.201643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.201664 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:26Z","lastTransitionTime":"2026-01-29T03:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.242885 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:26 crc kubenswrapper[4707]: E0129 03:28:26.243111 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.305009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.305087 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.305113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.305149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.305167 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:26Z","lastTransitionTime":"2026-01-29T03:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.408039 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.408101 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.408124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.408155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.408177 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:26Z","lastTransitionTime":"2026-01-29T03:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.512024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.512092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.512114 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.512141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.512160 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:26Z","lastTransitionTime":"2026-01-29T03:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.614790 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.615064 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.615129 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.615192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.615251 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:26Z","lastTransitionTime":"2026-01-29T03:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.696352 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 16:08:50.284813257 +0000 UTC Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.717872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.717907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.717918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.717932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.717941 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:26Z","lastTransitionTime":"2026-01-29T03:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.821253 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.821324 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.821348 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.821381 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.821401 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:26Z","lastTransitionTime":"2026-01-29T03:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.924819 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.924898 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.924924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.924954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:26 crc kubenswrapper[4707]: I0129 03:28:26.924976 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:26Z","lastTransitionTime":"2026-01-29T03:28:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.027500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.027594 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.027610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.027639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.027655 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.131058 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.131146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.131164 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.131193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.131218 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.234083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.234149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.234167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.234193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.234211 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.242582 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.242650 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:27 crc kubenswrapper[4707]: E0129 03:28:27.242701 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:27 crc kubenswrapper[4707]: E0129 03:28:27.242870 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.242905 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:27 crc kubenswrapper[4707]: E0129 03:28:27.242977 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.265809 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\
\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.285469 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.299157 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.313461 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.338201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.338251 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.338270 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.338296 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.338315 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.350699 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.370072 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.390423 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.406878 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.420693 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.434765 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.442275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.442365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.442467 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.442495 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.442509 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.449788 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a
1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.464962 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.494142 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:20Z\\\",\\\"message\\\":\\\"or removal\\\\nI0129 03:28:20.294458 6383 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0129 03:28:20.294463 6383 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0129 03:28:20.294480 6383 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:20.294484 6383 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0129 03:28:20.294500 6383 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0129 03:28:20.294525 6383 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0129 03:28:20.294598 6383 factory.go:656] Stopping watch factory\\\\nI0129 03:28:20.294613 6383 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:20.294646 6383 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:20.294655 6383 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:20.294663 6383 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:20.294669 6383 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0129 03:28:20.294674 6383 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0129 03:28:20.294679 6383 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:20.294684 6383 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:20.294692 6383 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0129 03:28:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.511107 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.533569 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.546726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.546802 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.546821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.546848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.546868 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.558747 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.583628 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.611601 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:27Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.650006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.650060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.650073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.650092 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.650106 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.697396 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 18:35:47.362501399 +0000 UTC Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.753004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.753050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.753064 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.753083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.753100 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.855652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.855720 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.855734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.855749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.855762 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.958918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.958997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.959020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.959045 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:27 crc kubenswrapper[4707]: I0129 03:28:27.959064 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:27Z","lastTransitionTime":"2026-01-29T03:28:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.062153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.062227 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.062241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.062263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.062281 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.166088 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.166154 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.166172 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.166198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.166216 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.243408 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:28 crc kubenswrapper[4707]: E0129 03:28:28.243575 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.269063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.269112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.269122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.269139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.269162 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.372199 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.372254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.372272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.372293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.372310 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.475747 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.475809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.475824 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.475851 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.475867 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.579079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.579131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.579145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.579166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.579181 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.682096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.682157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.682172 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.682199 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.682216 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.698055 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 16:03:26.47284219 +0000 UTC Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.784617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.784659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.784671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.784685 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.784694 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.888889 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.889341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.889434 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.889553 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.889662 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.993588 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.993653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.993672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.993703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:28 crc kubenswrapper[4707]: I0129 03:28:28.993719 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:28Z","lastTransitionTime":"2026-01-29T03:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.097167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.097225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.097249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.097282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.097306 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:29Z","lastTransitionTime":"2026-01-29T03:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.200704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.200765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.200784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.200808 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.200825 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:29Z","lastTransitionTime":"2026-01-29T03:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.243364 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:29 crc kubenswrapper[4707]: E0129 03:28:29.243510 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.243650 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:29 crc kubenswrapper[4707]: E0129 03:28:29.243902 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.244314 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:29 crc kubenswrapper[4707]: E0129 03:28:29.244687 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.303598 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.304365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.304399 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.304434 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.304452 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:29Z","lastTransitionTime":"2026-01-29T03:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.407330 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.407431 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.407458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.407488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.407513 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:29Z","lastTransitionTime":"2026-01-29T03:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.510796 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.510845 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.510858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.510879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.510893 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:29Z","lastTransitionTime":"2026-01-29T03:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.613903 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.613963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.613983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.614008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.614025 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:29Z","lastTransitionTime":"2026-01-29T03:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.698188 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:09:40.719972334 +0000 UTC
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.717090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.717157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.717170 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.717187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.717199 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:29Z","lastTransitionTime":"2026-01-29T03:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.820080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.820137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.820146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.820171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.820229 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:29Z","lastTransitionTime":"2026-01-29T03:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.923511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.923612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.923624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.923647 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:29 crc kubenswrapper[4707]: I0129 03:28:29.923660 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:29Z","lastTransitionTime":"2026-01-29T03:28:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.026877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.026936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.026955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.026979 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.026996 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.130720 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.130775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.130792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.130820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.130837 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.234491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.234600 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.234984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.235243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.235585 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.242918 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:28:30 crc kubenswrapper[4707]: E0129 03:28:30.243146 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.338005 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.338040 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.338049 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.338062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.338072 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.441236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.441307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.441320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.441339 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.441349 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.544454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.544492 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.544519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.544552 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.544563 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.648983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.649053 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.649072 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.649102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.649125 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.698824 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 13:47:23.594950368 +0000 UTC
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.752419 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.752469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.752490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.752517 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.752563 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.854971 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.855009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.855019 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.855033 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.855044 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.957301 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.957336 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.957346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.957360 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:30 crc kubenswrapper[4707]: I0129 03:28:30.957369 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:30Z","lastTransitionTime":"2026-01-29T03:28:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.059892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.059929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.059939 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.059956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.059968 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.163620 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.163666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.163677 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.163695 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.163707 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.242942 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.242983 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.243025 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:28:31 crc kubenswrapper[4707]: E0129 03:28:31.243131 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:28:31 crc kubenswrapper[4707]: E0129 03:28:31.243284 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:28:31 crc kubenswrapper[4707]: E0129 03:28:31.243477 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.265976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.266031 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.266048 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.266076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.266100 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.368111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.368161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.368177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.368193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.368204 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.471008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.471069 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.471088 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.471113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.471133 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.574767 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.574827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.574844 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.574868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.574885 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.678422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.678480 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.678498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.678524 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.678572 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.699190 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 19:25:44.926111479 +0000 UTC
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.781286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.781340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.781358 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.781385 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.781402 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.884094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.884458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.884476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.884500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.884516 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.987001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.987065 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.987109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.987138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:31 crc kubenswrapper[4707]: I0129 03:28:31.987150 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:31Z","lastTransitionTime":"2026-01-29T03:28:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.089973 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.090014 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.090023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.090041 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.090050 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:32Z","lastTransitionTime":"2026-01-29T03:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.192856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.192922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.192940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.192961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.192975 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:32Z","lastTransitionTime":"2026-01-29T03:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.242805 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:28:32 crc kubenswrapper[4707]: E0129 03:28:32.243380 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.296474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.296973 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.297113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.297332 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.297486 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:32Z","lastTransitionTime":"2026-01-29T03:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.400136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.400726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.401133 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.401451 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.401701 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:32Z","lastTransitionTime":"2026-01-29T03:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.504587 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.504653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.504671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.504697 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.504714 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:32Z","lastTransitionTime":"2026-01-29T03:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.607345 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.607403 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.607424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.607451 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.607470 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:32Z","lastTransitionTime":"2026-01-29T03:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.700147 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 13:34:19.577362502 +0000 UTC Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.710434 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.710503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.710530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.710614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.710634 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:32Z","lastTransitionTime":"2026-01-29T03:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.814006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.814062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.814073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.814093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.814108 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:32Z","lastTransitionTime":"2026-01-29T03:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.917195 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.917261 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.917280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.917305 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:32 crc kubenswrapper[4707]: I0129 03:28:32.917326 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:32Z","lastTransitionTime":"2026-01-29T03:28:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.020957 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.021015 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.021030 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.021052 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.021068 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.123720 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.123777 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.123794 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.123821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.123838 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.226871 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.226922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.226931 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.226949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.226963 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.243318 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:33 crc kubenswrapper[4707]: E0129 03:28:33.243468 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.243318 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.243578 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:33 crc kubenswrapper[4707]: E0129 03:28:33.243686 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:33 crc kubenswrapper[4707]: E0129 03:28:33.244045 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.329484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.329524 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.329551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.329569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.329584 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.432462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.432509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.432516 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.432555 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.432569 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.534902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.534937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.534946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.534962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.534972 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.637987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.638033 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.638044 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.638063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.638076 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.701273 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:15:24.296589746 +0000 UTC Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.741407 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.741447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.741456 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.741474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.741486 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.844933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.844999 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.845010 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.845071 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.845088 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.948385 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.948439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.948458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.948479 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:33 crc kubenswrapper[4707]: I0129 03:28:33.948491 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:33Z","lastTransitionTime":"2026-01-29T03:28:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.052405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.052487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.052514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.052593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.052621 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.125091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.125165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.125177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.125202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.125225 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: E0129 03:28:34.144582 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:34Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.149177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.149447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.149686 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.149877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.150060 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: E0129 03:28:34.169807 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:34Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.175934 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.176142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.176320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.176581 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.176764 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: E0129 03:28:34.195294 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:34Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.200669 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.200722 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.200736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.200755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.200766 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: E0129 03:28:34.217624 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:34Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.222295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.222386 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.222408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.222440 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.222463 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: E0129 03:28:34.237009 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:34Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:34 crc kubenswrapper[4707]: E0129 03:28:34.237247 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.239411 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.239461 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.239480 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.239508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.239529 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.242740 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:34 crc kubenswrapper[4707]: E0129 03:28:34.243018 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.244306 4707 scope.go:117] "RemoveContainer" containerID="e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7" Jan 29 03:28:34 crc kubenswrapper[4707]: E0129 03:28:34.244818 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.342758 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.342801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.342813 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.342834 4707 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.342848 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.445635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.445672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.445682 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.445699 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.445711 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.549364 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.549411 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.549424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.549445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.549459 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.652267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.652321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.652332 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.652351 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.652364 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.702039 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 14:12:53.033268299 +0000 UTC Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.755366 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.755439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.755455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.755472 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.755487 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.858599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.858678 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.858691 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.858727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.858744 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.961181 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.961217 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.961233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.961248 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:34 crc kubenswrapper[4707]: I0129 03:28:34.961262 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:34Z","lastTransitionTime":"2026-01-29T03:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.064405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.064488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.064558 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.064583 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.064599 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.166759 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.166832 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.166845 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.166868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.166884 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.243330 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.243401 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.243356 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:35 crc kubenswrapper[4707]: E0129 03:28:35.243552 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:35 crc kubenswrapper[4707]: E0129 03:28:35.243686 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:35 crc kubenswrapper[4707]: E0129 03:28:35.243874 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.269223 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.269285 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.269297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.269331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.269345 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.371805 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.371887 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.371906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.371940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.371962 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.474961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.475036 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.475048 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.475070 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.475087 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.578311 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.578373 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.578383 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.578404 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.578416 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.681731 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.681780 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.681793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.681815 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.681828 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.702228 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 12:24:46.92872156 +0000 UTC Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.784549 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.784592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.784602 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.784619 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.784632 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.887601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.887653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.887668 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.887691 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.887709 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.991071 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.991111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.991121 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.991137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:35 crc kubenswrapper[4707]: I0129 03:28:35.991150 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:35Z","lastTransitionTime":"2026-01-29T03:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.094181 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.094246 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.094256 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.094332 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.094348 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:36Z","lastTransitionTime":"2026-01-29T03:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.197102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.197155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.197165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.197281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.197296 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:36Z","lastTransitionTime":"2026-01-29T03:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.242680 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:36 crc kubenswrapper[4707]: E0129 03:28:36.242825 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.301418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.301486 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.301496 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.301512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.301522 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:36Z","lastTransitionTime":"2026-01-29T03:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.409741 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.409819 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.409835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.409873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.409893 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:36Z","lastTransitionTime":"2026-01-29T03:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.513606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.513996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.514089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.514185 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.514270 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:36Z","lastTransitionTime":"2026-01-29T03:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.616750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.616830 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.616868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.616894 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.616905 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:36Z","lastTransitionTime":"2026-01-29T03:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.703231 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 01:38:27.132349159 +0000 UTC Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.719420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.719496 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.719516 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.719573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.719591 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:36Z","lastTransitionTime":"2026-01-29T03:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.822380 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.822417 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.822426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.822443 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.822453 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:36Z","lastTransitionTime":"2026-01-29T03:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.925653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.925715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.925727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.925746 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:36 crc kubenswrapper[4707]: I0129 03:28:36.925761 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:36Z","lastTransitionTime":"2026-01-29T03:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.028300 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.028643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.028951 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.029133 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.029300 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.132731 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.132779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.132791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.132815 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.132830 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.236171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.236247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.236267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.236295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.236313 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.243550 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.243733 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.243839 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:37 crc kubenswrapper[4707]: E0129 03:28:37.243760 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:37 crc kubenswrapper[4707]: E0129 03:28:37.243968 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:37 crc kubenswrapper[4707]: E0129 03:28:37.244117 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.260310 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.276364 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.302625 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:20Z\\\",\\\"message\\\":\\\"or removal\\\\nI0129 03:28:20.294458 6383 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0129 03:28:20.294463 6383 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0129 03:28:20.294480 6383 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:20.294484 6383 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0129 03:28:20.294500 6383 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0129 03:28:20.294525 6383 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0129 03:28:20.294598 6383 factory.go:656] Stopping watch factory\\\\nI0129 03:28:20.294613 6383 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:20.294646 6383 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:20.294655 6383 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:20.294663 6383 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:20.294669 6383 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0129 03:28:20.294674 6383 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0129 03:28:20.294679 6383 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:20.294684 6383 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:20.294692 6383 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0129 03:28:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.316319 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.332337 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.339646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.339919 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.340006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.340094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.340164 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.349310 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.363704 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.378457 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.391374 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.403114 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.413445 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.435627 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.443516 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.443608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.443629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.443655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.443677 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.450613 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.465374 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.478092 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.487355 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.499238 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.509991 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:37Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:37 crc 
kubenswrapper[4707]: I0129 03:28:37.546346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.546385 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.546394 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.546415 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.546426 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.649475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.649589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.649613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.649641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.649669 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.704111 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 19:06:36.681638307 +0000 UTC Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.753105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.753148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.753161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.753178 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.753192 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.855523 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.855638 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.855665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.855705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.855733 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.960160 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.960227 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.960241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.960263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:37 crc kubenswrapper[4707]: I0129 03:28:37.960277 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:37Z","lastTransitionTime":"2026-01-29T03:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.063667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.063707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.063719 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.063735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.063747 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.167461 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.167580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.167609 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.167639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.167668 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.243598 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:38 crc kubenswrapper[4707]: E0129 03:28:38.243799 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.270453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.270515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.270555 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.270581 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.270601 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.372580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.372624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.372636 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.372657 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.372668 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.475077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.475141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.475153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.475172 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.475183 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.578624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.578674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.578683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.578702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.578713 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.681282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.681332 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.681344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.681369 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.681381 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.705224 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 01:03:41.984602655 +0000 UTC Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.784247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.784314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.784329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.784353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.784368 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.887434 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.887496 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.887513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.887563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.887579 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.990257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.990315 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.990327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.990348 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:38 crc kubenswrapper[4707]: I0129 03:28:38.990364 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:38Z","lastTransitionTime":"2026-01-29T03:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.077214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:39 crc kubenswrapper[4707]: E0129 03:28:39.077414 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:39 crc kubenswrapper[4707]: E0129 03:28:39.077501 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs podName:08dd724c-b8cc-45c6-9a61-13643a1c0d75 nodeName:}" failed. No retries permitted until 2026-01-29 03:29:11.077481271 +0000 UTC m=+104.561710176 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs") pod "network-metrics-daemon-652c6" (UID: "08dd724c-b8cc-45c6-9a61-13643a1c0d75") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.093356 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.093396 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.093409 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.093430 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.093444 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:39Z","lastTransitionTime":"2026-01-29T03:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.196426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.196471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.196484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.196506 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.196519 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:39Z","lastTransitionTime":"2026-01-29T03:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.243192 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.243327 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.243351 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:39 crc kubenswrapper[4707]: E0129 03:28:39.243488 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:39 crc kubenswrapper[4707]: E0129 03:28:39.243627 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:39 crc kubenswrapper[4707]: E0129 03:28:39.243718 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.299094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.299128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.299139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.299155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.299165 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:39Z","lastTransitionTime":"2026-01-29T03:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.401463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.401526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.401562 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.401586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.401602 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:39Z","lastTransitionTime":"2026-01-29T03:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.504710 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.504761 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.504774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.504801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.504815 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:39Z","lastTransitionTime":"2026-01-29T03:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.608437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.608490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.608508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.608529 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.608567 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:39Z","lastTransitionTime":"2026-01-29T03:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.705355 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 01:47:32.750654646 +0000 UTC Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.712247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.712297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.712309 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.712329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.712341 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:39Z","lastTransitionTime":"2026-01-29T03:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.815179 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.815557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.815658 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.815733 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.816252 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:39Z","lastTransitionTime":"2026-01-29T03:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.919579 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.919629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.919638 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.919656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:39 crc kubenswrapper[4707]: I0129 03:28:39.919667 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:39Z","lastTransitionTime":"2026-01-29T03:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.022812 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.022874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.022889 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.022909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.022924 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.125989 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.126043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.126054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.126073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.126085 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.229414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.229488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.229500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.229556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.229573 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.243089 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:40 crc kubenswrapper[4707]: E0129 03:28:40.243288 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.332215 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.332284 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.332303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.332338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.332358 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.435443 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.435513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.435566 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.435601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.435691 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.538778 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.538827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.538840 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.538856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.538868 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.641247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.641295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.641312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.641334 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.641350 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.706174 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 13:52:02.064514304 +0000 UTC Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.744751 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.744808 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.744826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.744850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.744868 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.813743 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/0.log" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.813821 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd938209-46da-4f33-8496-23beb193ac96" containerID="7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391" exitCode=1 Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.813868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vh9xt" event={"ID":"bd938209-46da-4f33-8496-23beb193ac96","Type":"ContainerDied","Data":"7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.814598 4707 scope.go:117] "RemoveContainer" containerID="7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.835473 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:40Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.847097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.847172 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.847191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.847244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.847263 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.851673 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:40Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.871575 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:20Z\\\",\\\"message\\\":\\\"or removal\\\\nI0129 03:28:20.294458 6383 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0129 03:28:20.294463 6383 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0129 03:28:20.294480 6383 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:20.294484 6383 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0129 03:28:20.294500 6383 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0129 03:28:20.294525 6383 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0129 03:28:20.294598 6383 factory.go:656] Stopping watch factory\\\\nI0129 03:28:20.294613 6383 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:20.294646 6383 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:20.294655 6383 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:20.294663 6383 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:20.294669 6383 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0129 03:28:20.294674 6383 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0129 03:28:20.294679 6383 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:20.294684 6383 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:20.294692 6383 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0129 03:28:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:40Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.885473 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:40Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.904143 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:40Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.930440 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:40Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.950191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.950240 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.950260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.950281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.950293 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:40Z","lastTransitionTime":"2026-01-29T03:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.954880 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:40Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.978642 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:40Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:40 crc kubenswrapper[4707]: I0129 03:28:40.991726 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:40Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.004884 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.014814 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.035262 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.050704 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.052760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.052827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.052840 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 
03:28:41.052859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.052873 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.066275 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.078865 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.090817 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.105253 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:40Z\\\",\\\"message\\\":\\\"2026-01-29T03:27:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8\\\\n2026-01-29T03:27:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8 to /host/opt/cni/bin/\\\\n2026-01-29T03:27:55Z [verbose] multus-daemon started\\\\n2026-01-29T03:27:55Z [verbose] Readiness Indicator file check\\\\n2026-01-29T03:28:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.123964 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc 
kubenswrapper[4707]: I0129 03:28:41.155444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.155496 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.155510 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.155529 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.155558 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.243180 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.243311 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.243360 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:41 crc kubenswrapper[4707]: E0129 03:28:41.243523 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:41 crc kubenswrapper[4707]: E0129 03:28:41.243657 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:41 crc kubenswrapper[4707]: E0129 03:28:41.243785 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.258599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.258665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.258685 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.258715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.258735 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.361330 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.361381 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.361391 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.361408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.361418 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.464406 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.464443 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.464453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.464468 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.464478 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.567696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.567772 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.567792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.567821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.567842 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.670798 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.670865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.670884 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.670909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.670926 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.707194 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:40:33.511675078 +0000 UTC Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.774808 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.774886 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.774901 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.774925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.774943 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.820931 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/0.log" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.820996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vh9xt" event={"ID":"bd938209-46da-4f33-8496-23beb193ac96","Type":"ContainerStarted","Data":"8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.835522 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.856256 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.873602 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.878703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.878784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.878812 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.878841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.878861 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.893444 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.913253 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.933710 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.961459 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:40Z\\\",\\\"message\\\":\\\"2026-01-29T03:27:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8\\\\n2026-01-29T03:27:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8 to /host/opt/cni/bin/\\\\n2026-01-29T03:27:55Z [verbose] multus-daemon started\\\\n2026-01-29T03:27:55Z [verbose] Readiness Indicator file check\\\\n2026-01-29T03:28:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.979146 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:41Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:41 crc 
kubenswrapper[4707]: I0129 03:28:41.983036 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.983090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.983109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.983133 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:41 crc kubenswrapper[4707]: I0129 03:28:41.983160 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:41Z","lastTransitionTime":"2026-01-29T03:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.007342 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.024044 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.063651 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:20Z\\\",\\\"message\\\":\\\"or removal\\\\nI0129 03:28:20.294458 6383 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0129 03:28:20.294463 6383 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0129 03:28:20.294480 6383 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:20.294484 6383 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0129 03:28:20.294500 6383 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0129 03:28:20.294525 6383 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0129 03:28:20.294598 6383 factory.go:656] Stopping watch factory\\\\nI0129 03:28:20.294613 6383 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:20.294646 6383 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:20.294655 6383 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:20.294663 6383 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:20.294669 6383 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0129 03:28:20.294674 6383 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0129 03:28:20.294679 6383 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:20.294684 6383 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:20.294692 6383 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0129 03:28:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.084763 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.087962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.088023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.088048 4707 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.088078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.088102 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:42Z","lastTransitionTime":"2026-01-29T03:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.106679 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.129404 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.150727 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.170989 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.192654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.193071 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.193261 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.193408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.193589 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:42Z","lastTransitionTime":"2026-01-29T03:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.213433 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.237782 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:42Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.242987 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:42 crc kubenswrapper[4707]: E0129 03:28:42.243211 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.298571 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.298875 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.299058 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.299210 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.299354 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:42Z","lastTransitionTime":"2026-01-29T03:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.402835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.402879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.402897 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.402922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.402939 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:42Z","lastTransitionTime":"2026-01-29T03:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.505955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.506005 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.506023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.506045 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.506067 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:42Z","lastTransitionTime":"2026-01-29T03:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.609340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.609406 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.609424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.609449 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.609469 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:42Z","lastTransitionTime":"2026-01-29T03:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.708690 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 17:59:19.103759626 +0000 UTC Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.712781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.712834 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.712850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.712874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.712889 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:42Z","lastTransitionTime":"2026-01-29T03:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.816978 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.817050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.817072 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.817099 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.817117 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:42Z","lastTransitionTime":"2026-01-29T03:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.920744 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.920815 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.920826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.920849 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:42 crc kubenswrapper[4707]: I0129 03:28:42.920863 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:42Z","lastTransitionTime":"2026-01-29T03:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.023675 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.023775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.023794 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.023822 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.023840 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.127988 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.128049 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.128071 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.128104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.128128 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.231736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.231789 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.231810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.231837 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.231857 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.243102 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.243140 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:43 crc kubenswrapper[4707]: E0129 03:28:43.243414 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.243137 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:43 crc kubenswrapper[4707]: E0129 03:28:43.243827 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:43 crc kubenswrapper[4707]: E0129 03:28:43.244181 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.334781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.334854 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.334874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.334901 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.334920 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.438806 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.438872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.438890 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.438916 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.438936 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.541811 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.541872 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.541890 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.541912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.541931 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.645310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.645382 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.645401 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.645429 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.645448 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.709289 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 01:26:56.118866589 +0000 UTC Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.749206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.749767 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.749787 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.749812 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.749830 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.852724 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.852787 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.852805 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.852836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.852855 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.956633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.957161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.957316 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.957447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:43 crc kubenswrapper[4707]: I0129 03:28:43.957650 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:43Z","lastTransitionTime":"2026-01-29T03:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.062102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.062484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.062676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.062849 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.063057 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.166737 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.166823 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.166841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.167492 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.167720 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.243534 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:44 crc kubenswrapper[4707]: E0129 03:28:44.244224 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.272040 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.272446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.272667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.272852 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.273242 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.376242 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.376634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.376772 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.376958 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.377093 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.420611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.420671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.420694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.420722 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.420741 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: E0129 03:28:44.440052 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:44Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.444910 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.444969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.444986 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.445010 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.445030 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: E0129 03:28:44.460163 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:44Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.464711 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.464809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.464863 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.464889 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.464911 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: E0129 03:28:44.486360 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:44Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.493027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.493105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.493121 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.493146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.493162 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: E0129 03:28:44.510467 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:44Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.517080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.517128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.517145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.517173 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.517190 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: E0129 03:28:44.542947 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:44Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:44 crc kubenswrapper[4707]: E0129 03:28:44.543303 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.546335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.546428 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.546454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.546485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.546504 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.649846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.649955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.650046 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.650085 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.650106 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.709594 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 21:34:35.998731842 +0000 UTC Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.754187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.754244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.754263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.754287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.754304 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.857588 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.857652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.857670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.857701 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.857720 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.960975 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.961049 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.961067 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.961093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:44 crc kubenswrapper[4707]: I0129 03:28:44.961111 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:44Z","lastTransitionTime":"2026-01-29T03:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.065312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.065391 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.065414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.065445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.065469 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.168967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.169040 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.169060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.169086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.169108 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.242726 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:45 crc kubenswrapper[4707]: E0129 03:28:45.242914 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.242922 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.242959 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:45 crc kubenswrapper[4707]: E0129 03:28:45.243191 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:45 crc kubenswrapper[4707]: E0129 03:28:45.243341 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.271910 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.271993 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.272008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.272043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.272058 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.375320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.375383 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.375401 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.375426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.375444 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.477861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.477920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.477942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.477967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.477986 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.581434 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.581493 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.581512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.581567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.581586 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.684915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.684976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.684998 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.685023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.685040 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.710503 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 07:01:55.454424771 +0000 UTC Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.788085 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.788166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.788179 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.788205 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.788219 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.891436 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.891532 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.891604 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.891641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.891672 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.994732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.994803 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.994826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.994856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:45 crc kubenswrapper[4707]: I0129 03:28:45.994879 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:45Z","lastTransitionTime":"2026-01-29T03:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.097766 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.097842 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.097869 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.097900 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.097924 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:46Z","lastTransitionTime":"2026-01-29T03:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.201770 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.201836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.201856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.201887 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.201911 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:46Z","lastTransitionTime":"2026-01-29T03:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.243139 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:46 crc kubenswrapper[4707]: E0129 03:28:46.243314 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.244493 4707 scope.go:117] "RemoveContainer" containerID="e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.304865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.304917 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.304929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.304953 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.304965 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:46Z","lastTransitionTime":"2026-01-29T03:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.408520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.408626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.408648 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.408675 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.408694 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:46Z","lastTransitionTime":"2026-01-29T03:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.511262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.511305 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.511338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.511361 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.511378 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:46Z","lastTransitionTime":"2026-01-29T03:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.614477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.614582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.614601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.614627 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.614647 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:46Z","lastTransitionTime":"2026-01-29T03:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.710623 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 03:08:28.035255139 +0000 UTC Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.718186 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.718254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.718284 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.718312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.718334 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:46Z","lastTransitionTime":"2026-01-29T03:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.821044 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.821089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.821097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.821113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.821123 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:46Z","lastTransitionTime":"2026-01-29T03:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.852972 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/2.log" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.855832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.856497 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.872975 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77
c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.887616 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.902959 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.918474 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.923503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.923559 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.923570 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.923587 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.923598 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:46Z","lastTransitionTime":"2026-01-29T03:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.934569 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.949083 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.959948 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.972770 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.986111 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:40Z\\\",\\\"message\\\":\\\"2026-01-29T03:27:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8\\\\n2026-01-29T03:27:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8 to /host/opt/cni/bin/\\\\n2026-01-29T03:27:55Z [verbose] multus-daemon started\\\\n2026-01-29T03:27:55Z [verbose] Readiness Indicator file check\\\\n2026-01-29T03:28:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:46 crc kubenswrapper[4707]: I0129 03:28:46.996875 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:46Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc 
kubenswrapper[4707]: I0129 03:28:47.016053 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.026147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.026208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.026221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.026237 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.026247 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.030950 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.044425 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.058378 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.074565 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.091644 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.111666 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:20Z\\\",\\\"message\\\":\\\"or removal\\\\nI0129 03:28:20.294458 6383 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0129 03:28:20.294463 6383 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0129 03:28:20.294480 6383 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:20.294484 6383 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0129 03:28:20.294500 6383 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0129 03:28:20.294525 6383 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0129 03:28:20.294598 6383 factory.go:656] Stopping watch factory\\\\nI0129 03:28:20.294613 6383 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:20.294646 6383 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:20.294655 6383 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:20.294663 6383 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:20.294669 6383 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0129 03:28:20.294674 6383 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0129 03:28:20.294679 6383 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:20.294684 6383 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:20.294692 6383 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0129 
03:28:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.127775 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.128916 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.128982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.128992 4707 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.129011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.129037 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.232464 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.232510 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.232520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.232557 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.232567 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.242736 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.242833 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:47 crc kubenswrapper[4707]: E0129 03:28:47.242856 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.242979 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:47 crc kubenswrapper[4707]: E0129 03:28:47.242980 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:47 crc kubenswrapper[4707]: E0129 03:28:47.243157 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.258866 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.274774 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.299169 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:20Z\\\",\\\"message\\\":\\\"or removal\\\\nI0129 03:28:20.294458 6383 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0129 03:28:20.294463 6383 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0129 03:28:20.294480 6383 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:20.294484 6383 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0129 03:28:20.294500 6383 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0129 03:28:20.294525 6383 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0129 03:28:20.294598 6383 factory.go:656] Stopping watch factory\\\\nI0129 03:28:20.294613 6383 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:20.294646 6383 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:20.294655 6383 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:20.294663 6383 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:20.294669 6383 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0129 03:28:20.294674 6383 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0129 03:28:20.294679 6383 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:20.294684 6383 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:20.294692 6383 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0129 
03:28:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.316586 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.335563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.335630 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.335649 4707 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.335674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.335690 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.337761 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 
cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.360636 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.382681 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.407738 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.423896 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.437936 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.441011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.441060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.441078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.441104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.441123 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.450608 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.471475 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43
ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.485681 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.498836 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.511716 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.528045 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.543855 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.543903 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.543917 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.543942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.543959 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.549293 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:40Z\\\",\\\"message\\\":\\\"2026-01-29T03:27:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8\\\\n2026-01-29T03:27:54+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8 to /host/opt/cni/bin/\\\\n2026-01-29T03:27:55Z [verbose] multus-daemon started\\\\n2026-01-29T03:27:55Z [verbose] Readiness Indicator file check\\\\n2026-01-29T03:28:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.565377 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.648138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.648489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.648599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.648734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.648839 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.710781 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 12:42:02.565808875 +0000 UTC Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.752193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.752230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.752242 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.752262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.752274 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.855614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.855717 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.855738 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.855763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.855786 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.861468 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/3.log" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.862723 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/2.log" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.866354 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d" exitCode=1 Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.866408 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.866456 4707 scope.go:117] "RemoveContainer" containerID="e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.867526 4707 scope.go:117] "RemoveContainer" containerID="bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d" Jan 29 03:28:47 crc kubenswrapper[4707]: E0129 03:28:47.867862 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.883231 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.901962 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.921428 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e01deed9e5f2c9ecbb97d53eb68e0bc5214f38356c7718d2e55e8798ce99ded7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:20Z\\\",\\\"message\\\":\\\"or removal\\\\nI0129 03:28:20.294458 6383 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0129 03:28:20.294463 6383 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0129 03:28:20.294480 6383 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0129 03:28:20.294484 6383 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0129 03:28:20.294500 6383 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0129 03:28:20.294525 6383 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0129 03:28:20.294598 6383 factory.go:656] Stopping watch factory\\\\nI0129 03:28:20.294613 6383 ovnkube.go:599] Stopped ovnkube\\\\nI0129 03:28:20.294646 6383 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0129 03:28:20.294655 6383 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 03:28:20.294663 6383 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0129 03:28:20.294669 6383 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0129 03:28:20.294674 6383 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0129 03:28:20.294679 6383 handler.go:208] Removed *v1.Node event handler 2\\\\nI0129 03:28:20.294684 6383 handler.go:208] Removed *v1.Node event handler 7\\\\nI0129 03:28:20.294692 6383 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0129 03:28:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:47Z\\\",\\\"message\\\":\\\"alerter-4ln5h\\\\nI0129 03:28:47.167256 6786 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0129 03:28:47.167253 6786 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0129 03:28:47.167286 6786 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm\\\\nI0129 03:28:47.167282 6786 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0129 03:28:47.167295 6786 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm\\\\nI0129 03:28:47.167147 6786 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-652c6 in node crc\\\\nI0129 03:28:47.167308 6786 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm in node crc\\\\nI0129 03:28:47.167323 6786 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm after 0 failed attempt(s)\\\\nI0129 03:28:47.167334 6786 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-nn7fm\\\\nI0129 03:28:47.167164 6786 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\
"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.933794 4707 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.950446 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.958140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.958167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.958175 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.958191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.958205 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:47Z","lastTransitionTime":"2026-01-29T03:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.965269 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.980488 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:47 crc kubenswrapper[4707]: I0129 03:28:47.999041 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:47Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.013513 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.027108 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.038213 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.051430 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.060928 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.061044 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.061056 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.061076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.061090 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.072186 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.086007 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.103257 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.117369 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.129077 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.142948 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:40Z\\\",\\\"message\\\":\\\"2026-01-29T03:27:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8\\\\n2026-01-29T03:27:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8 to /host/opt/cni/bin/\\\\n2026-01-29T03:27:55Z [verbose] multus-daemon started\\\\n2026-01-29T03:27:55Z [verbose] Readiness Indicator file check\\\\n2026-01-29T03:28:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.165102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.165166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.165177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 
03:28:48.165198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.165214 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.242890 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:48 crc kubenswrapper[4707]: E0129 03:28:48.243096 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.268316 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.268398 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.268424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.268453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.268513 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.371919 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.371984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.371996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.372019 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.372035 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.475721 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.475782 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.475795 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.475818 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.475833 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.579264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.579343 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.579362 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.579389 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.579415 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.683244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.683318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.683335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.683361 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.683379 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.711717 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 10:25:11.642430443 +0000 UTC Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.786244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.786308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.786324 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.786349 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.786364 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.873512 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/3.log" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.879319 4707 scope.go:117] "RemoveContainer" containerID="bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d" Jan 29 03:28:48 crc kubenswrapper[4707]: E0129 03:28:48.879632 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.889055 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.889136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.889157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.889183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.889200 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.899684 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d
20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.918010 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.940620 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.963398 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\",\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.983440 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:48Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.992136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.992202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.992224 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 
03:28:48.992254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:48 crc kubenswrapper[4707]: I0129 03:28:48.992275 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:48Z","lastTransitionTime":"2026-01-29T03:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.008342 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.028263 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.049650 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.066581 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.109147 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.111587 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.111655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.111673 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.111698 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.111717 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:49Z","lastTransitionTime":"2026-01-29T03:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.136604 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:40Z\\\",\\\"message\\\":\\\"2026-01-29T03:27:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8\\\\n2026-01-29T03:27:54+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8 to /host/opt/cni/bin/\\\\n2026-01-29T03:27:55Z [verbose] multus-daemon started\\\\n2026-01-29T03:27:55Z [verbose] Readiness Indicator file check\\\\n2026-01-29T03:28:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.161662 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.202748 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f
7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.214477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.214679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.214765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.214847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.214969 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:49Z","lastTransitionTime":"2026-01-29T03:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.227516 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.243421 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.243514 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.243571 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:49 crc kubenswrapper[4707]: E0129 03:28:49.243723 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:49 crc kubenswrapper[4707]: E0129 03:28:49.243963 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:49 crc kubenswrapper[4707]: E0129 03:28:49.244067 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.264643 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:47Z\\\",\\\"message\\\":\\\"alerter-4ln5h\\\\nI0129 03:28:47.167256 6786 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0129 03:28:47.167253 6786 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port 
openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0129 03:28:47.167286 6786 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm\\\\nI0129 03:28:47.167282 6786 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0129 03:28:47.167295 6786 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm\\\\nI0129 03:28:47.167147 6786 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-652c6 in node crc\\\\nI0129 03:28:47.167308 6786 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm in node crc\\\\nI0129 03:28:47.167323 6786 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm after 0 failed attempt(s)\\\\nI0129 03:28:47.167334 6786 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-nn7fm\\\\nI0129 03:28:47.167164 6786 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.282504 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.299257 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.315782 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:49Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.317458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.317499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.317514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.317532 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.317568 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:49Z","lastTransitionTime":"2026-01-29T03:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.420633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.420698 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.420710 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.420730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.420740 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:49Z","lastTransitionTime":"2026-01-29T03:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.524774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.525145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.525220 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.525304 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.525376 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:49Z","lastTransitionTime":"2026-01-29T03:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.629066 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.629457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.629528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.629644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.629711 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:49Z","lastTransitionTime":"2026-01-29T03:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.712074 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 14:53:47.237059625 +0000 UTC Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.732576 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.732629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.732648 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.732673 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.732692 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:49Z","lastTransitionTime":"2026-01-29T03:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.835575 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.835608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.835617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.835635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.835644 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:49Z","lastTransitionTime":"2026-01-29T03:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.938929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.938977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.938997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.939022 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:49 crc kubenswrapper[4707]: I0129 03:28:49.939040 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:49Z","lastTransitionTime":"2026-01-29T03:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.041705 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.041743 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.041754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.041772 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.041785 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.144824 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.144885 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.144904 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.144929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.144950 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.242659 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.242892 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.248648 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.248715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.248729 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.248755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.248776 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.353035 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.353099 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.353118 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.353144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.353161 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.456997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.457069 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.457090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.457114 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.457132 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.528851 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.529016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529073 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.529031866 +0000 UTC m=+148.013260811 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.529175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.529241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529179 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529337 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529370 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529390 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.529397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529471 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.529446509 +0000 UTC m=+148.013675454 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529504 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-29 03:29:54.5294889 +0000 UTC m=+148.013717835 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529665 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529667 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529691 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529714 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529743 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.529729337 +0000 UTC m=+148.013958272 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 03:28:50 crc kubenswrapper[4707]: E0129 03:28:50.529772 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.529756818 +0000 UTC m=+148.013985763 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.559915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.559973 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.559992 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.560018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.560036 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.663758 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.663820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.663831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.663855 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.663873 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.712631 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 17:29:31.844021553 +0000 UTC Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.767669 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.767751 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.767801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.767829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.767846 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.871281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.871375 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.871426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.871457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.871533 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.974440 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.974509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.974564 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.974585 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:50 crc kubenswrapper[4707]: I0129 03:28:50.974598 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:50Z","lastTransitionTime":"2026-01-29T03:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.076569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.076621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.076634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.076657 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.076670 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:51Z","lastTransitionTime":"2026-01-29T03:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.179323 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.179653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.179692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.179721 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.179781 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:51Z","lastTransitionTime":"2026-01-29T03:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.243165 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.243168 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.243207 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:51 crc kubenswrapper[4707]: E0129 03:28:51.243647 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:51 crc kubenswrapper[4707]: E0129 03:28:51.243814 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:51 crc kubenswrapper[4707]: E0129 03:28:51.243886 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.259273 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.282600 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.282628 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.282637 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.282679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.282692 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:51Z","lastTransitionTime":"2026-01-29T03:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.385908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.385944 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.385952 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.385970 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.385981 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:51Z","lastTransitionTime":"2026-01-29T03:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.488228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.488261 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.488271 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.488286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.488298 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:51Z","lastTransitionTime":"2026-01-29T03:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.590251 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.590520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.590670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.590768 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.590829 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:51Z","lastTransitionTime":"2026-01-29T03:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.693032 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.693246 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.693256 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.693273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.693286 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:51Z","lastTransitionTime":"2026-01-29T03:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.713991 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 07:55:30.362691461 +0000 UTC Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.796046 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.796131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.796145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.796174 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.796186 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:51Z","lastTransitionTime":"2026-01-29T03:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.898561 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.898614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.898626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.898646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:51 crc kubenswrapper[4707]: I0129 03:28:51.898660 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:51Z","lastTransitionTime":"2026-01-29T03:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.001643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.001688 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.001700 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.001722 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.001743 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.106264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.106729 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.106937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.107108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.107313 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.211088 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.211161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.211187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.211218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.211240 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.243397 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:52 crc kubenswrapper[4707]: E0129 03:28:52.243811 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.315586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.315936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.316068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.316243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.316993 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.421236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.421736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.422004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.422257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.422465 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.525290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.525694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.525775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.525843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.525897 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.628812 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.628882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.628903 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.628931 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.628954 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.715079 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 02:27:27.135867794 +0000 UTC Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.731387 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.731462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.731481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.731509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.731530 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.834339 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.834404 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.834415 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.834441 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.834456 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.937745 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.937851 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.937869 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.937893 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:52 crc kubenswrapper[4707]: I0129 03:28:52.937907 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:52Z","lastTransitionTime":"2026-01-29T03:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.042010 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.042089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.042114 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.042147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.042170 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.145925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.146000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.146026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.146057 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.146081 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.243724 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.243789 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.243724 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:53 crc kubenswrapper[4707]: E0129 03:28:53.243960 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:53 crc kubenswrapper[4707]: E0129 03:28:53.244163 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:53 crc kubenswrapper[4707]: E0129 03:28:53.244331 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.249758 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.249817 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.249835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.249859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.249878 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.352931 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.353431 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.353615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.353791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.353931 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.457381 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.457441 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.457451 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.457475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.457494 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.560612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.560658 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.560669 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.560692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.560704 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.664238 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.664691 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.664831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.664984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.665177 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.715672 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 23:03:01.725021267 +0000 UTC Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.768719 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.768772 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.768784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.768805 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.768818 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.872339 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.872431 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.872453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.872511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.872577 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.975098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.975153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.975162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.975182 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:53 crc kubenswrapper[4707]: I0129 03:28:53.975194 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:53Z","lastTransitionTime":"2026-01-29T03:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.078256 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.078299 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.078314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.078344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.078361 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.181349 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.181449 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.181477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.181512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.181595 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.242773 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:54 crc kubenswrapper[4707]: E0129 03:28:54.243011 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.284916 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.284976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.284996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.285024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.285043 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.388975 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.389046 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.389082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.389128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.389149 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.492494 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.492613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.492640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.492670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.492692 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.597067 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.597151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.597176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.597208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.597227 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.700711 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.700775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.700802 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.700831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.700850 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.702752 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.702834 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.702857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.702891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.702916 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.716782 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 13:15:08.054297569 +0000 UTC Jan 29 03:28:54 crc kubenswrapper[4707]: E0129 03:28:54.729505 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",
\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.735703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.735779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.735804 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.735838 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.735863 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: E0129 03:28:54.752896 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.757919 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.757982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.758006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.758037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.758061 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: E0129 03:28:54.779110 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.785756 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.785821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.785838 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.785869 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.785887 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: E0129 03:28:54.806737 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.812882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.812949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.812969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.812994 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.813013 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: E0129 03:28:54.834425 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45db4fed-0e87-470b-b8f4-53e2ae2c5a04\\\",\\\"systemUUID\\\":\\\"e4a0c178-c4c4-4abe-9640-7d6723ff2d92\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:54Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:54 crc kubenswrapper[4707]: E0129 03:28:54.834720 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.836996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.837074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.837090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.837111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.837125 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.939890 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.939963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.939982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.940009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:54 crc kubenswrapper[4707]: I0129 03:28:54.940027 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:54Z","lastTransitionTime":"2026-01-29T03:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.044022 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.044098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.044118 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.044148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.044165 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.147914 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.147979 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.148006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.148042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.148068 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.243644 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.243667 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.243773 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:55 crc kubenswrapper[4707]: E0129 03:28:55.244012 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:55 crc kubenswrapper[4707]: E0129 03:28:55.244141 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:55 crc kubenswrapper[4707]: E0129 03:28:55.244274 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.250634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.250679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.250699 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.250726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.250746 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.354511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.354628 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.354651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.354682 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.354703 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.458673 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.458978 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.459155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.459306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.459490 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.563193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.563250 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.563271 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.563296 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.563314 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.666976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.667049 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.667072 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.667109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.667145 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.718004 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 14:44:29.103686222 +0000 UTC Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.770902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.770955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.770968 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.770990 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.771007 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.874741 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.875113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.875257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.875422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.875595 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.979497 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.979666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.979688 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.979755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:55 crc kubenswrapper[4707]: I0129 03:28:55.979775 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:55Z","lastTransitionTime":"2026-01-29T03:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.082515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.082593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.082609 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.082630 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.082645 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:56Z","lastTransitionTime":"2026-01-29T03:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.188344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.188427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.188440 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.188462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.188479 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:56Z","lastTransitionTime":"2026-01-29T03:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.243581 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:56 crc kubenswrapper[4707]: E0129 03:28:56.243743 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.291374 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.291438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.291458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.291487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.291504 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:56Z","lastTransitionTime":"2026-01-29T03:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.394774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.394845 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.394865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.394891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.394908 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:56Z","lastTransitionTime":"2026-01-29T03:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.498233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.498318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.498344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.498380 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.498402 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:56Z","lastTransitionTime":"2026-01-29T03:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.602269 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.602322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.602339 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.602364 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.602383 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:56Z","lastTransitionTime":"2026-01-29T03:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.709465 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.709511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.709523 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.709555 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.709565 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:56Z","lastTransitionTime":"2026-01-29T03:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.719091 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 02:24:39.643382489 +0000 UTC Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.811838 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.811875 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.811885 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.811900 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.811908 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:56Z","lastTransitionTime":"2026-01-29T03:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.916967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.917039 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.917062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.917093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:56 crc kubenswrapper[4707]: I0129 03:28:56.917118 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:56Z","lastTransitionTime":"2026-01-29T03:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.020222 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.020281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.020299 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.020322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.020339 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.123972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.124036 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.124057 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.124085 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.124103 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.227064 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.227114 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.227135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.227158 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.227181 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.242619 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.242682 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:57 crc kubenswrapper[4707]: E0129 03:28:57.242805 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.242916 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:57 crc kubenswrapper[4707]: E0129 03:28:57.243054 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:57 crc kubenswrapper[4707]: E0129 03:28:57.243202 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.268814 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e7fe9a-65cf-4ad7-903c-54528d0317e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f204a947071a975f063fddf3d243e558f3c5b845d71d3b359c24e5dc4847fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e800b6f9054350c622bdc9d2ec08c968eb574c2857f7804959aa17e4d279f08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55419a1da7fc9f1653bf80dc5e304f9536c9a06cb2b0546a764542eb9a521672\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.291251 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b267753eb23edfe08efa581a1d8159ad4b513eb51c90b82316a9b358da4f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.309257 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pf578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3b3d28-c2ba-4aea-b865-e72c6327eb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee4a345b764ffebf5094737cdc638e81b0c7549a410f0133e2d79c34553723a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pf578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.331246 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.331371 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.331448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.331486 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.331658 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.336973 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ccb9a1-a472-4658-bd4a-21f40131ed03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a11430a75be824f0a0a64235fa7a3615ae90a473e9a2ab23d006dc5e7cda0ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ca20c19a05dd1ecf8f62c4cb4891d4857e86211d94a4cc2dfc390babf869b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34460252ca0fabc7f9c56f4cc2e80fb327e1011d4f5dbf006d40bd0eb9682fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://645479fe48ed31cb446c7c65f7c621ecb5162a40f3a888e3440dc4d8c1d49bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14db1f4cd51483e1d8bfc90fd9fa8e8bf5307270fab2aa964fdf2523605df60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f0f3c3e0804fe16cf2969677937dcc301fe1b985b8d0fbad9817cf480de9eb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0023fcf83103560a4ddb4e47142b9efe3a978c503b501b2517a811db0a36ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ed43ddf436e8f819552ae9fc379db34484d67f13098022f2c87677d51b07009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-29T03:27:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.361075 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.381958 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.401245 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df12d101-b13d-4276-94b7-422c6609d2e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1cf473cb095ec6662f6e12c0df20449ddaeef51283b8ad8b8da6d44bd608db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440
e6347191c29040696e4f4b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzldv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hbz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.421624 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-t4vft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d25a01e8-a854-424f-a238-e41c41cea5f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d7613f142dea5d38e603aa3c78ab5fe835a6f65a18d0cf0d7020645c993798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhhx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-t4vft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.435112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.435173 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.435190 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.435217 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.435234 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.446853 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vh9xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd938209-46da-4f33-8496-23beb193ac96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:40Z\\\",\\\"message\\\":\\\"2026-01-29T03:27:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8\\\\n2026-01-29T03:27:54+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4370c61c-a894-4939-8630-96f999fa60a8 to /host/opt/cni/bin/\\\\n2026-01-29T03:27:55Z [verbose] multus-daemon started\\\\n2026-01-29T03:27:55Z [verbose] Readiness Indicator file check\\\\n2026-01-29T03:28:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bxcpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vh9xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.470684 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-652c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08dd724c-b8cc-45c6-9a61-13643a1c0d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zmnt5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-652c6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.493316 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44897706-8ced-442a-a218-8d24adb413fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c22b2e1add99ebfb386ba6cc5ccc0812ee629347d45b8e4551bc7cc7e8b255b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c9e5acb69f6084045c9d96f1d42a1705e7842776e09bc1a6a85891b879eb95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c7fe7e2da837aae906bb6bea8d46ac7e1045a17e774d1b8f8e06cf12b66cc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53
b57bb80b642bf4110ca5724d4763b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://710435f00f5ebe3e8dc656cf772b4ff53b57bb80b642bf4110ca5724d4763b76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.518623 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://276ba73b2a887f34e1eb1ecf0e1aa99ffffbbd05c249c68f49def526558a543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.537982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.538051 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.538071 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.538097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.538116 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.551939 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T03:28:47Z\\\",\\\"message\\\":\\\"alerter-4ln5h\\\\nI0129 03:28:47.167256 6786 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0129 03:28:47.167253 6786 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port 
openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0129 03:28:47.167286 6786 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm\\\\nI0129 03:28:47.167282 6786 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0129 03:28:47.167295 6786 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm\\\\nI0129 03:28:47.167147 6786 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-652c6 in node crc\\\\nI0129 03:28:47.167308 6786 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm in node crc\\\\nI0129 03:28:47.167323 6786 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nn7fm after 0 failed attempt(s)\\\\nI0129 03:28:47.167334 6786 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-nn7fm\\\\nI0129 03:28:47.167164 6786 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:28:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3809b7892f47c4c876
0446721ea73e9aed5b65d76eb6702d9a53a58c04569463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxg4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nn7fm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.570085 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91cba8d8-e784-454a-8397-936cb3a94b79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5af305dd86ccb0f622ab429c1496d1fb2cc2187c71583809ed3075ae60511c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927806fb77861764fef7e35e6d610484cd671
fdc4b7edbb0275820ec3cb1cc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:28:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dqhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:28:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s2v9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.585523 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dd9937b-220e-4249-89b9-d6b1833146d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b110d8ba6f6270c5c3d47918b6434a2bfc50ed4cc695b2e5b8ba53d384d201f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a96e5d7ff7f586055be04ab49393e4c0751d23abf2700fd527e68cee822b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83a96e5d7ff7f586055be04ab49393e4c0751d23abf2700fd527e68cee822b29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.610888 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T03:27:41Z\\\"
,\\\"message\\\":\\\"W0129 03:27:30.425238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0129 03:27:30.426344 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769657250 cert, and key in /tmp/serving-cert-729004022/serving-signer.crt, /tmp/serving-cert-729004022/serving-signer.key\\\\nI0129 03:27:30.645147 1 observer_polling.go:159] Starting file observer\\\\nW0129 03:27:30.650054 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0129 03:27:30.650228 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 03:27:30.652693 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-729004022/tls.crt::/tmp/serving-cert-729004022/tls.key\\\\\\\"\\\\nF0129 03:27:41.112370 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-01-29T03:27:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.633486 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f10b51cb1c2bdedc263c46def352dc75958c33d9a1ee4fb4e26a03481f4c1d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a79961e976c0ae1ac5f5d20bf64d87d5e070b26abf07c186f9d945b0814ea3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 
03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.641658 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.641737 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.641762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.641791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.641813 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.655355 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.676473 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lnjls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bedc9546-6a4d-44ec-b95f-84c3329307cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T03:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5181466e055379b77c27cac523ec49fe719bd5b5065e7d0af72e49cec69f642d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d22e5135d9219c5f040858ebeba25d2688fdd7f5b762c6a60c973df4563fa457\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae4a264a6f288922afb5869850606670fdb3be59948efec3cbf3da6ac370e84d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c828811665c0f402d815ab6e8c6ed63551dcc9b0f4063585e68c75be8353671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9cb
00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c9cb00c9430e902bdffabb8edbf66f4a6fd2c18553a860b7505c424bd1f5d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b979f13c78b3204e595f595e04cefce001b616d244344e6aae1bc210064814c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fac748c78ea0fae04e28b95bc12511e0550fd33ef8d9ecade9b86b1ba9ebab86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:27:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:27:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbztl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T03:27:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lnjls\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T03:28:57Z is after 2025-08-24T17:21:41Z" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.720157 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 08:18:42.131072373 +0000 UTC Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.744427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.744517 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.744613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.744684 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.744703 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.847889 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.848460 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.848788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.849025 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.849464 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.952801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.952883 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.952908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.952934 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:57 crc kubenswrapper[4707]: I0129 03:28:57.952953 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:57Z","lastTransitionTime":"2026-01-29T03:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.055732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.055796 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.055813 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.055867 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.055886 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.159867 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.159923 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.159941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.159966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.159983 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.242699 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:28:58 crc kubenswrapper[4707]: E0129 03:28:58.242893 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.262840 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.262899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.262921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.262952 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.262971 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.366464 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.366533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.366612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.366643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.366668 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.470403 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.470459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.470478 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.470503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.470521 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.573639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.573696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.573715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.573740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.573757 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.676269 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.676303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.676311 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.676324 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.676334 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.721215 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 21:35:36.12784732 +0000 UTC Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.779436 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.779470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.779512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.779526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.779555 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.883254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.883744 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.883911 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.884043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.884144 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.987418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.987483 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.987503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.987528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:58 crc kubenswrapper[4707]: I0129 03:28:58.987580 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:58Z","lastTransitionTime":"2026-01-29T03:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.090624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.090698 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.090723 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.090753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.090774 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:59Z","lastTransitionTime":"2026-01-29T03:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.195143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.195227 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.195249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.195313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.195338 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:59Z","lastTransitionTime":"2026-01-29T03:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.243322 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.243614 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:28:59 crc kubenswrapper[4707]: E0129 03:28:59.243751 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.243867 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:28:59 crc kubenswrapper[4707]: E0129 03:28:59.243973 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:28:59 crc kubenswrapper[4707]: E0129 03:28:59.244610 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.299654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.299708 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.299726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.299750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.299768 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:59Z","lastTransitionTime":"2026-01-29T03:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.403241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.403352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.403366 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.403391 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.403407 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:59Z","lastTransitionTime":"2026-01-29T03:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.506655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.506758 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.506783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.506825 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.506844 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:59Z","lastTransitionTime":"2026-01-29T03:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.609995 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.610056 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.610074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.610100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.610119 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:59Z","lastTransitionTime":"2026-01-29T03:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.713028 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.713086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.713105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.713127 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.713145 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:59Z","lastTransitionTime":"2026-01-29T03:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.721790 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 09:17:42.228654238 +0000 UTC Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.816441 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.816522 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.816577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.816606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.816624 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:59Z","lastTransitionTime":"2026-01-29T03:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.920082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.920151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.920170 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.920197 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:28:59 crc kubenswrapper[4707]: I0129 03:28:59.920215 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:28:59Z","lastTransitionTime":"2026-01-29T03:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.023624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.023726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.023743 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.023823 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.023839 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.127340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.127661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.127797 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.127930 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.128051 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.231135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.231210 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.231229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.231257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.231281 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.242807 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:00 crc kubenswrapper[4707]: E0129 03:29:00.243028 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.334651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.335681 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.335732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.335763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.335787 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.439853 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.439920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.439942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.439968 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.439986 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.544009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.544077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.544096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.544120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.544137 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.647341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.647405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.647417 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.647440 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.647455 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.722919 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 22:11:47.385984377 +0000 UTC Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.750784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.750852 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.750871 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.750901 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.750925 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.854162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.854231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.854248 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.854275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.854296 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.957072 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.957507 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.957734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.957946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:00 crc kubenswrapper[4707]: I0129 03:29:00.958098 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:00Z","lastTransitionTime":"2026-01-29T03:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.061269 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.061317 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.061328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.061347 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.061361 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.164868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.164945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.164969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.164998 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.165016 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.242622 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.242670 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.242704 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:01 crc kubenswrapper[4707]: E0129 03:29:01.242806 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:29:01 crc kubenswrapper[4707]: E0129 03:29:01.242928 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:29:01 crc kubenswrapper[4707]: E0129 03:29:01.243076 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.268207 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.268265 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.268284 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.268312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.268329 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.371258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.371301 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.371311 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.371327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.371338 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.474318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.474395 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.474420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.474446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.474465 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.577678 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.577741 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.577759 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.577780 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.577799 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.680744 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.680785 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.680800 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.680820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.680833 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.723436 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 01:16:27.559926467 +0000 UTC Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.783865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.783909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.783920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.783938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.783952 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.886861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.886940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.886961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.886990 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.887010 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.990362 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.990430 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.990450 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.990474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:01 crc kubenswrapper[4707]: I0129 03:29:01.990491 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:01Z","lastTransitionTime":"2026-01-29T03:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.093787 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.093940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.093960 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.093987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.094004 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:02Z","lastTransitionTime":"2026-01-29T03:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.197862 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.197943 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.197967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.197998 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.198021 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:02Z","lastTransitionTime":"2026-01-29T03:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.242920 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:02 crc kubenswrapper[4707]: E0129 03:29:02.243054 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.244642 4707 scope.go:117] "RemoveContainer" containerID="bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d" Jan 29 03:29:02 crc kubenswrapper[4707]: E0129 03:29:02.245078 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.300756 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.300826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.300851 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.300882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.300904 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:02Z","lastTransitionTime":"2026-01-29T03:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.404236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.404297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.404314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.404340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.404356 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:02Z","lastTransitionTime":"2026-01-29T03:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.507949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.508022 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.508041 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.508068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.508088 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:02Z","lastTransitionTime":"2026-01-29T03:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.611333 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.611402 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.611421 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.611445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.611464 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:02Z","lastTransitionTime":"2026-01-29T03:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.715055 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.715117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.715136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.715160 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.715177 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:02Z","lastTransitionTime":"2026-01-29T03:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.724454 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:23:14.097765311 +0000 UTC Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.819259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.819328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.819352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.819382 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.819402 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:02Z","lastTransitionTime":"2026-01-29T03:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.922136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.922200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.922303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.922336 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:02 crc kubenswrapper[4707]: I0129 03:29:02.922359 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:02Z","lastTransitionTime":"2026-01-29T03:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.026406 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.026459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.026470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.026487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.026498 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.131090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.131171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.131192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.131226 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.131246 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.234669 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.234727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.234747 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.234773 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.234790 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.242983 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.243099 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:03 crc kubenswrapper[4707]: E0129 03:29:03.243310 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.243446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:03 crc kubenswrapper[4707]: E0129 03:29:03.243797 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:29:03 crc kubenswrapper[4707]: E0129 03:29:03.243964 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.337174 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.337257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.337275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.337300 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.337318 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.440616 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.440661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.440672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.440690 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.440702 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.544074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.544155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.544180 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.544211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.544236 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.647989 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.648078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.648104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.648140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.648166 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.725342 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 15:22:32.697029521 +0000 UTC
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.752777 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.752865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.752891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.752922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.752944 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.856638 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.856704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.856721 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.856749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.856766 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.960020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.960080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.960100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.960125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:03 crc kubenswrapper[4707]: I0129 03:29:03.960141 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:03Z","lastTransitionTime":"2026-01-29T03:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.062998 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.063050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.063068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.063095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.063115 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.166481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.166589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.166610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.166660 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.166682 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.243277 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:04 crc kubenswrapper[4707]: E0129 03:29:04.243823 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.269841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.269900 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.269922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.269945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.269963 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.373840 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.374241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.374437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.374677 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.374885 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.478809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.478880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.478899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.478928 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.478946 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.582091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.582168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.582189 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.582218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.582237 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.685257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.685335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.685357 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.685384 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.685404 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.726477 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 16:03:18.367142618 +0000 UTC
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.789047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.789104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.789127 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.789158 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.789180 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.892926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.893038 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.893054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.893079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.893097 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.921198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.921254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.921280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.921308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.921326 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T03:29:04Z","lastTransitionTime":"2026-01-29T03:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.998320 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"]
Jan 29 03:29:04 crc kubenswrapper[4707]: I0129 03:29:04.998947 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: W0129 03:29:05.002967 4707 reflector.go:561] object-"openshift-cluster-version"/"cluster-version-operator-serving-cert": failed to list *v1.Secret: secrets "cluster-version-operator-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object
Jan 29 03:29:05 crc kubenswrapper[4707]: E0129 03:29:05.003034 4707 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-version-operator-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 29 03:29:05 crc kubenswrapper[4707]: W0129 03:29:05.003400 4707 reflector.go:561] object-"openshift-cluster-version"/"default-dockercfg-gxtc4": failed to list *v1.Secret: secrets "default-dockercfg-gxtc4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object
Jan 29 03:29:05 crc kubenswrapper[4707]: E0129 03:29:05.003458 4707 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"default-dockercfg-gxtc4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-gxtc4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 29 03:29:05 crc kubenswrapper[4707]: W0129 03:29:05.003774 4707 reflector.go:561] object-"openshift-cluster-version"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object
Jan 29 03:29:05 crc kubenswrapper[4707]: E0129 03:29:05.003835 4707 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.004913 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.050616 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t4vft" podStartSLOduration=73.050591212 podStartE2EDuration="1m13.050591212s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.024956592 +0000 UTC m=+98.509185527" watchObservedRunningTime="2026-01-29 03:29:05.050591212 +0000 UTC m=+98.534820127"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.066453 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vh9xt" podStartSLOduration=73.066382876 podStartE2EDuration="1m13.066382876s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.051153669 +0000 UTC m=+98.535382614" watchObservedRunningTime="2026-01-29 03:29:05.066382876 +0000 UTC m=+98.550611821"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.101344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383774bb-eff1-4d4c-a455-05bfd2ffd98c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.101713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/383774bb-eff1-4d4c-a455-05bfd2ffd98c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.101862 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/383774bb-eff1-4d4c-a455-05bfd2ffd98c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.101996 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/383774bb-eff1-4d4c-a455-05bfd2ffd98c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.102150 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/383774bb-eff1-4d4c-a455-05bfd2ffd98c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.126147 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.126099229 podStartE2EDuration="1m18.126099229s" podCreationTimestamp="2026-01-29 03:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.107533932 +0000 UTC m=+98.591762897" watchObservedRunningTime="2026-01-29 03:29:05.126099229 +0000 UTC m=+98.610328154"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.170227 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podStartSLOduration=73.170209053 podStartE2EDuration="1m13.170209053s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.169736949 +0000 UTC m=+98.653965854" watchObservedRunningTime="2026-01-29 03:29:05.170209053 +0000 UTC m=+98.654437958"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.203604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383774bb-eff1-4d4c-a455-05bfd2ffd98c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.203670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/383774bb-eff1-4d4c-a455-05bfd2ffd98c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.203699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/383774bb-eff1-4d4c-a455-05bfd2ffd98c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.203748 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/383774bb-eff1-4d4c-a455-05bfd2ffd98c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.203811 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/383774bb-eff1-4d4c-a455-05bfd2ffd98c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.203877 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/383774bb-eff1-4d4c-a455-05bfd2ffd98c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.203888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/383774bb-eff1-4d4c-a455-05bfd2ffd98c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.205459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/383774bb-eff1-4d4c-a455-05bfd2ffd98c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.229569 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.229504984 podStartE2EDuration="46.229504984s" podCreationTimestamp="2026-01-29 03:28:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.208869344 +0000 UTC m=+98.693098259" watchObservedRunningTime="2026-01-29 03:29:05.229504984 +0000 UTC m=+98.713733929"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.243078 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.243110 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.243092 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:05 crc kubenswrapper[4707]: E0129 03:29:05.243322 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:05 crc kubenswrapper[4707]: E0129 03:29:05.243641 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:05 crc kubenswrapper[4707]: E0129 03:29:05.243801 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.285927 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s2v9m" podStartSLOduration=73.285909057 podStartE2EDuration="1m13.285909057s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.285294479 +0000 UTC m=+98.769523434" watchObservedRunningTime="2026-01-29 03:29:05.285909057 +0000 UTC m=+98.770137962"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.346146 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lnjls" podStartSLOduration=73.346113365 podStartE2EDuration="1m13.346113365s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.344864167 +0000 UTC m=+98.829093062" watchObservedRunningTime="2026-01-29 03:29:05.346113365 +0000 UTC m=+98.830342270"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.383986 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=14.383969891 podStartE2EDuration="14.383969891s" podCreationTimestamp="2026-01-29 03:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.365582389 +0000 UTC m=+98.849811294" watchObservedRunningTime="2026-01-29 03:29:05.383969891 +0000 UTC m=+98.868198796"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.384068 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=80.384063514 podStartE2EDuration="1m20.384063514s" podCreationTimestamp="2026-01-29 03:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.383439925 +0000 UTC m=+98.867668830" watchObservedRunningTime="2026-01-29 03:29:05.384063514 +0000 UTC m=+98.868292419"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.443053 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=74.443031974 podStartE2EDuration="1m14.443031974s" podCreationTimestamp="2026-01-29 03:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.430967512 +0000 UTC m=+98.915196417" watchObservedRunningTime="2026-01-29 03:29:05.443031974 +0000 UTC m=+98.927260879"
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.726755 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 19:31:51.96137842 +0000 UTC
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.726849 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Jan 29 03:29:05 crc kubenswrapper[4707]: I0129 03:29:05.740654 4707 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 29 03:29:06 crc kubenswrapper[4707]: E0129 03:29:06.204584 4707 secret.go:188] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:06 crc kubenswrapper[4707]: E0129 03:29:06.204742 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383774bb-eff1-4d4c-a455-05bfd2ffd98c-serving-cert podName:383774bb-eff1-4d4c-a455-05bfd2ffd98c nodeName:}" failed. No retries permitted until 2026-01-29 03:29:06.704715683 +0000 UTC m=+100.188944588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/383774bb-eff1-4d4c-a455-05bfd2ffd98c-serving-cert") pod "cluster-version-operator-5c965bbfc6-b9bwr" (UID: "383774bb-eff1-4d4c-a455-05bfd2ffd98c") : failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:06 crc kubenswrapper[4707]: I0129 03:29:06.208291 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 29 03:29:06 crc kubenswrapper[4707]: I0129 03:29:06.227884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/383774bb-eff1-4d4c-a455-05bfd2ffd98c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr"
Jan 29 03:29:06 crc kubenswrapper[4707]: I0129 03:29:06.242521 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:06 crc kubenswrapper[4707]: E0129 03:29:06.242706 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:06 crc kubenswrapper[4707]: I0129 03:29:06.401431 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 29 03:29:06 crc kubenswrapper[4707]: I0129 03:29:06.467586 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 29 03:29:06 crc kubenswrapper[4707]: I0129 03:29:06.722970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383774bb-eff1-4d4c-a455-05bfd2ffd98c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr" Jan 29 03:29:06 crc kubenswrapper[4707]: I0129 03:29:06.726888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383774bb-eff1-4d4c-a455-05bfd2ffd98c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b9bwr\" (UID: \"383774bb-eff1-4d4c-a455-05bfd2ffd98c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr" Jan 29 03:29:06 crc kubenswrapper[4707]: I0129 03:29:06.826464 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr" Jan 29 03:29:06 crc kubenswrapper[4707]: I0129 03:29:06.955712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr" event={"ID":"383774bb-eff1-4d4c-a455-05bfd2ffd98c","Type":"ContainerStarted","Data":"64c475730c00bc4be7ff4cbc327f2cf11b51ff1ab62bc93d929e57e5540f4b7c"} Jan 29 03:29:07 crc kubenswrapper[4707]: I0129 03:29:07.243502 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:07 crc kubenswrapper[4707]: E0129 03:29:07.245955 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:29:07 crc kubenswrapper[4707]: I0129 03:29:07.245981 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:07 crc kubenswrapper[4707]: E0129 03:29:07.246854 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:29:07 crc kubenswrapper[4707]: I0129 03:29:07.246286 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:07 crc kubenswrapper[4707]: E0129 03:29:07.247068 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:29:07 crc kubenswrapper[4707]: I0129 03:29:07.960491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr" event={"ID":"383774bb-eff1-4d4c-a455-05bfd2ffd98c","Type":"ContainerStarted","Data":"3929d0bbd7249f5ba48c8b2dd3e766bfe4f86bd36ddeece81ede825d1030ce3b"} Jan 29 03:29:07 crc kubenswrapper[4707]: I0129 03:29:07.978052 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pf578" podStartSLOduration=75.978019473 podStartE2EDuration="1m15.978019473s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:05.451937892 +0000 UTC m=+98.936166797" watchObservedRunningTime="2026-01-29 03:29:07.978019473 +0000 UTC m=+101.462248408" Jan 29 03:29:07 crc kubenswrapper[4707]: I0129 03:29:07.978556 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9bwr" podStartSLOduration=75.978526199 podStartE2EDuration="1m15.978526199s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:07.977347863 +0000 UTC m=+101.461576778" watchObservedRunningTime="2026-01-29 03:29:07.978526199 +0000 UTC m=+101.462755114" Jan 29 03:29:08 crc kubenswrapper[4707]: I0129 03:29:08.243045 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:08 crc kubenswrapper[4707]: E0129 03:29:08.243265 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:09 crc kubenswrapper[4707]: I0129 03:29:09.243282 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:09 crc kubenswrapper[4707]: I0129 03:29:09.243377 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:09 crc kubenswrapper[4707]: E0129 03:29:09.244388 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:29:09 crc kubenswrapper[4707]: I0129 03:29:09.243400 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:09 crc kubenswrapper[4707]: E0129 03:29:09.244590 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:29:09 crc kubenswrapper[4707]: E0129 03:29:09.244674 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:29:10 crc kubenswrapper[4707]: I0129 03:29:10.243564 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:10 crc kubenswrapper[4707]: E0129 03:29:10.243989 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:11 crc kubenswrapper[4707]: I0129 03:29:11.175295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:11 crc kubenswrapper[4707]: E0129 03:29:11.175557 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:29:11 crc kubenswrapper[4707]: E0129 03:29:11.175701 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs podName:08dd724c-b8cc-45c6-9a61-13643a1c0d75 nodeName:}" failed. No retries permitted until 2026-01-29 03:30:15.175672678 +0000 UTC m=+168.659901613 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs") pod "network-metrics-daemon-652c6" (UID: "08dd724c-b8cc-45c6-9a61-13643a1c0d75") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 03:29:11 crc kubenswrapper[4707]: I0129 03:29:11.242756 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:11 crc kubenswrapper[4707]: I0129 03:29:11.242837 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:11 crc kubenswrapper[4707]: E0129 03:29:11.242913 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:29:11 crc kubenswrapper[4707]: I0129 03:29:11.242837 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:11 crc kubenswrapper[4707]: E0129 03:29:11.243035 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:29:11 crc kubenswrapper[4707]: E0129 03:29:11.243250 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:29:12 crc kubenswrapper[4707]: I0129 03:29:12.243216 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:12 crc kubenswrapper[4707]: E0129 03:29:12.243411 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:13 crc kubenswrapper[4707]: I0129 03:29:13.242980 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:13 crc kubenswrapper[4707]: I0129 03:29:13.243356 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:13 crc kubenswrapper[4707]: E0129 03:29:13.243354 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:29:13 crc kubenswrapper[4707]: I0129 03:29:13.243428 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:13 crc kubenswrapper[4707]: E0129 03:29:13.243734 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:29:13 crc kubenswrapper[4707]: E0129 03:29:13.243842 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:29:14 crc kubenswrapper[4707]: I0129 03:29:14.243103 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:14 crc kubenswrapper[4707]: E0129 03:29:14.243300 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:15 crc kubenswrapper[4707]: I0129 03:29:15.242935 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:15 crc kubenswrapper[4707]: I0129 03:29:15.242948 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:15 crc kubenswrapper[4707]: E0129 03:29:15.243083 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:29:15 crc kubenswrapper[4707]: I0129 03:29:15.243441 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:15 crc kubenswrapper[4707]: E0129 03:29:15.243584 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:29:15 crc kubenswrapper[4707]: E0129 03:29:15.243771 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:29:15 crc kubenswrapper[4707]: I0129 03:29:15.243877 4707 scope.go:117] "RemoveContainer" containerID="bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d" Jan 29 03:29:15 crc kubenswrapper[4707]: E0129 03:29:15.244086 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nn7fm_openshift-ovn-kubernetes(f3eccef7-1d8e-42b5-b7c8-2cd378b7465a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" Jan 29 03:29:16 crc kubenswrapper[4707]: I0129 03:29:16.243099 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:16 crc kubenswrapper[4707]: E0129 03:29:16.243643 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:17 crc kubenswrapper[4707]: I0129 03:29:17.243704 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:17 crc kubenswrapper[4707]: I0129 03:29:17.243860 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:17 crc kubenswrapper[4707]: I0129 03:29:17.243911 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:17 crc kubenswrapper[4707]: E0129 03:29:17.245095 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:29:17 crc kubenswrapper[4707]: E0129 03:29:17.245308 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:29:17 crc kubenswrapper[4707]: E0129 03:29:17.245490 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:29:18 crc kubenswrapper[4707]: I0129 03:29:18.243633 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:18 crc kubenswrapper[4707]: E0129 03:29:18.243845 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:19 crc kubenswrapper[4707]: I0129 03:29:19.243049 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:19 crc kubenswrapper[4707]: I0129 03:29:19.243147 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:19 crc kubenswrapper[4707]: E0129 03:29:19.243233 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 03:29:19 crc kubenswrapper[4707]: E0129 03:29:19.243339 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 03:29:19 crc kubenswrapper[4707]: I0129 03:29:19.243475 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:19 crc kubenswrapper[4707]: E0129 03:29:19.243592 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 03:29:20 crc kubenswrapper[4707]: I0129 03:29:20.243047 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:29:20 crc kubenswrapper[4707]: E0129 03:29:20.243848 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75" Jan 29 03:29:21 crc kubenswrapper[4707]: I0129 03:29:21.242606 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:21 crc kubenswrapper[4707]: I0129 03:29:21.242764 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:21 crc kubenswrapper[4707]: E0129 03:29:21.242924 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:21 crc kubenswrapper[4707]: I0129 03:29:21.243042 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:21 crc kubenswrapper[4707]: E0129 03:29:21.243271 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:21 crc kubenswrapper[4707]: E0129 03:29:21.243577 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:22 crc kubenswrapper[4707]: I0129 03:29:22.243631 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:22 crc kubenswrapper[4707]: E0129 03:29:22.243849 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:23 crc kubenswrapper[4707]: I0129 03:29:23.242648 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:23 crc kubenswrapper[4707]: I0129 03:29:23.242707 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:23 crc kubenswrapper[4707]: I0129 03:29:23.242661 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:23 crc kubenswrapper[4707]: E0129 03:29:23.242823 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:23 crc kubenswrapper[4707]: E0129 03:29:23.243141 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:23 crc kubenswrapper[4707]: E0129 03:29:23.243403 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:24 crc kubenswrapper[4707]: I0129 03:29:24.242650 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:24 crc kubenswrapper[4707]: E0129 03:29:24.242812 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:25 crc kubenswrapper[4707]: I0129 03:29:25.243035 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:25 crc kubenswrapper[4707]: I0129 03:29:25.243134 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:25 crc kubenswrapper[4707]: I0129 03:29:25.243147 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:25 crc kubenswrapper[4707]: E0129 03:29:25.243259 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:25 crc kubenswrapper[4707]: E0129 03:29:25.243502 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:25 crc kubenswrapper[4707]: E0129 03:29:25.243736 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:26 crc kubenswrapper[4707]: I0129 03:29:26.243573 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:26 crc kubenswrapper[4707]: E0129 03:29:26.243782 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:27 crc kubenswrapper[4707]: I0129 03:29:27.034470 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/1.log"
Jan 29 03:29:27 crc kubenswrapper[4707]: I0129 03:29:27.035242 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/0.log"
Jan 29 03:29:27 crc kubenswrapper[4707]: I0129 03:29:27.035317 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd938209-46da-4f33-8496-23beb193ac96" containerID="8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965" exitCode=1
Jan 29 03:29:27 crc kubenswrapper[4707]: I0129 03:29:27.035373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vh9xt" event={"ID":"bd938209-46da-4f33-8496-23beb193ac96","Type":"ContainerDied","Data":"8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965"}
Jan 29 03:29:27 crc kubenswrapper[4707]: I0129 03:29:27.035451 4707 scope.go:117] "RemoveContainer" containerID="7988fddf9f471cba0bc5b8490107b8e6ceaefd838210bf785827ca9400258391"
Jan 29 03:29:27 crc kubenswrapper[4707]: I0129 03:29:27.036159 4707 scope.go:117] "RemoveContainer" containerID="8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965"
Jan 29 03:29:27 crc kubenswrapper[4707]: E0129 03:29:27.036458 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-vh9xt_openshift-multus(bd938209-46da-4f33-8496-23beb193ac96)\"" pod="openshift-multus/multus-vh9xt" podUID="bd938209-46da-4f33-8496-23beb193ac96"
Jan 29 03:29:27 crc kubenswrapper[4707]: E0129 03:29:27.221746 4707 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Jan 29 03:29:27 crc kubenswrapper[4707]: I0129 03:29:27.242969 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:27 crc kubenswrapper[4707]: I0129 03:29:27.243036 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:27 crc kubenswrapper[4707]: I0129 03:29:27.242984 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:27 crc kubenswrapper[4707]: E0129 03:29:27.245268 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:27 crc kubenswrapper[4707]: E0129 03:29:27.245419 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:27 crc kubenswrapper[4707]: E0129 03:29:27.245638 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:27 crc kubenswrapper[4707]: E0129 03:29:27.366067 4707 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 29 03:29:28 crc kubenswrapper[4707]: I0129 03:29:28.048863 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/1.log"
Jan 29 03:29:28 crc kubenswrapper[4707]: I0129 03:29:28.243204 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:28 crc kubenswrapper[4707]: E0129 03:29:28.243496 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:29 crc kubenswrapper[4707]: I0129 03:29:29.243356 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:29 crc kubenswrapper[4707]: I0129 03:29:29.244303 4707 scope.go:117] "RemoveContainer" containerID="bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d"
Jan 29 03:29:29 crc kubenswrapper[4707]: I0129 03:29:29.243476 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:29 crc kubenswrapper[4707]: E0129 03:29:29.244315 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:29 crc kubenswrapper[4707]: E0129 03:29:29.244472 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:29 crc kubenswrapper[4707]: I0129 03:29:29.243510 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:29 crc kubenswrapper[4707]: E0129 03:29:29.244849 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:30 crc kubenswrapper[4707]: I0129 03:29:30.058898 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/3.log"
Jan 29 03:29:30 crc kubenswrapper[4707]: I0129 03:29:30.062307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerStarted","Data":"e6ba46a3b002e4a00eceb67f566d451ea3eff8379adb7ccdc4ae2f8298abd464"}
Jan 29 03:29:30 crc kubenswrapper[4707]: I0129 03:29:30.110742 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podStartSLOduration=98.110717904 podStartE2EDuration="1m38.110717904s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:30.109777826 +0000 UTC m=+123.594006731" watchObservedRunningTime="2026-01-29 03:29:30.110717904 +0000 UTC m=+123.594946809"
Jan 29 03:29:30 crc kubenswrapper[4707]: I0129 03:29:30.239945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-652c6"]
Jan 29 03:29:30 crc kubenswrapper[4707]: I0129 03:29:30.240126 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:30 crc kubenswrapper[4707]: E0129 03:29:30.240262 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:31 crc kubenswrapper[4707]: I0129 03:29:31.243265 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:31 crc kubenswrapper[4707]: I0129 03:29:31.243424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:31 crc kubenswrapper[4707]: E0129 03:29:31.243468 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:31 crc kubenswrapper[4707]: I0129 03:29:31.243566 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:31 crc kubenswrapper[4707]: E0129 03:29:31.243963 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:31 crc kubenswrapper[4707]: E0129 03:29:31.244144 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:32 crc kubenswrapper[4707]: I0129 03:29:32.243131 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:32 crc kubenswrapper[4707]: E0129 03:29:32.243342 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:32 crc kubenswrapper[4707]: E0129 03:29:32.367867 4707 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 29 03:29:33 crc kubenswrapper[4707]: I0129 03:29:33.243261 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:33 crc kubenswrapper[4707]: E0129 03:29:33.243447 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:33 crc kubenswrapper[4707]: I0129 03:29:33.243849 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:33 crc kubenswrapper[4707]: E0129 03:29:33.244068 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:33 crc kubenswrapper[4707]: I0129 03:29:33.244177 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:33 crc kubenswrapper[4707]: E0129 03:29:33.244659 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:34 crc kubenswrapper[4707]: I0129 03:29:34.243615 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:34 crc kubenswrapper[4707]: E0129 03:29:34.244123 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:35 crc kubenswrapper[4707]: I0129 03:29:35.243594 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:35 crc kubenswrapper[4707]: I0129 03:29:35.243651 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:35 crc kubenswrapper[4707]: I0129 03:29:35.243713 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:35 crc kubenswrapper[4707]: E0129 03:29:35.243805 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:35 crc kubenswrapper[4707]: E0129 03:29:35.243947 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:35 crc kubenswrapper[4707]: E0129 03:29:35.244129 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:36 crc kubenswrapper[4707]: I0129 03:29:36.242574 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:36 crc kubenswrapper[4707]: E0129 03:29:36.242795 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:37 crc kubenswrapper[4707]: I0129 03:29:37.243636 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:37 crc kubenswrapper[4707]: I0129 03:29:37.243682 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:37 crc kubenswrapper[4707]: I0129 03:29:37.243765 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:37 crc kubenswrapper[4707]: E0129 03:29:37.245902 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:37 crc kubenswrapper[4707]: E0129 03:29:37.246075 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:37 crc kubenswrapper[4707]: E0129 03:29:37.246271 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:37 crc kubenswrapper[4707]: E0129 03:29:37.368634 4707 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 29 03:29:38 crc kubenswrapper[4707]: I0129 03:29:38.407414 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:38 crc kubenswrapper[4707]: E0129 03:29:38.407639 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:38 crc kubenswrapper[4707]: I0129 03:29:38.412085 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:38 crc kubenswrapper[4707]: E0129 03:29:38.412308 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:39 crc kubenswrapper[4707]: I0129 03:29:39.242989 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:39 crc kubenswrapper[4707]: E0129 03:29:39.243178 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:39 crc kubenswrapper[4707]: I0129 03:29:39.242983 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:39 crc kubenswrapper[4707]: E0129 03:29:39.243583 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:40 crc kubenswrapper[4707]: I0129 03:29:40.242950 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:40 crc kubenswrapper[4707]: I0129 03:29:40.243008 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:40 crc kubenswrapper[4707]: E0129 03:29:40.243427 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:40 crc kubenswrapper[4707]: E0129 03:29:40.243610 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:40 crc kubenswrapper[4707]: I0129 03:29:40.243770 4707 scope.go:117] "RemoveContainer" containerID="8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965"
Jan 29 03:29:40 crc kubenswrapper[4707]: I0129 03:29:40.424290 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/1.log"
Jan 29 03:29:40 crc kubenswrapper[4707]: I0129 03:29:40.424394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vh9xt" event={"ID":"bd938209-46da-4f33-8496-23beb193ac96","Type":"ContainerStarted","Data":"5655bbc11ea24b509e78271170c1b3b66ff0b6788c59aa6680676258a96736b3"}
Jan 29 03:29:41 crc kubenswrapper[4707]: I0129 03:29:41.243808 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:41 crc kubenswrapper[4707]: I0129 03:29:41.243893 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:41 crc kubenswrapper[4707]: E0129 03:29:41.244050 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 03:29:41 crc kubenswrapper[4707]: E0129 03:29:41.244292 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 03:29:42 crc kubenswrapper[4707]: I0129 03:29:42.243164 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:42 crc kubenswrapper[4707]: I0129 03:29:42.243337 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:42 crc kubenswrapper[4707]: E0129 03:29:42.243821 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 03:29:42 crc kubenswrapper[4707]: E0129 03:29:42.243998 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-652c6" podUID="08dd724c-b8cc-45c6-9a61-13643a1c0d75"
Jan 29 03:29:43 crc kubenswrapper[4707]: I0129 03:29:43.242660 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 03:29:43 crc kubenswrapper[4707]: I0129 03:29:43.242750 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 03:29:43 crc kubenswrapper[4707]: I0129 03:29:43.245327 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 29 03:29:43 crc kubenswrapper[4707]: I0129 03:29:43.248647 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 29 03:29:44 crc kubenswrapper[4707]: I0129 03:29:44.243235 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6"
Jan 29 03:29:44 crc kubenswrapper[4707]: I0129 03:29:44.243380 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 03:29:44 crc kubenswrapper[4707]: I0129 03:29:44.245879 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 29 03:29:44 crc kubenswrapper[4707]: I0129 03:29:44.246587 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 29 03:29:44 crc kubenswrapper[4707]: I0129 03:29:44.247194 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 29 03:29:44 crc kubenswrapper[4707]: I0129 03:29:44.250346 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.868634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.935080 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn"]
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.935782 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.938334 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-h88jj"]
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.939473 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.943103 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.945952 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.946032 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.946134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.948198 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kkzfz"]
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.949827 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.953004 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.966388 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mtt54"]
Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.977461 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.979366 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm"] Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.991572 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hp957"] Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.992062 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm" Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.992236 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.994400 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.995058 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 29 03:29:45 crc kubenswrapper[4707]: I0129 03:29:45.995626 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e542f9-e9bf-424e-9d2c-852baf887b17-serving-cert\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/52c7f103-7152-484c-aff9-da45a3f8ac20-node-pullsecrets\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-image-import-ca\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwzx4\" (UniqueName: \"kubernetes.io/projected/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-kube-api-access-kwzx4\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-config\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/52c7f103-7152-484c-aff9-da45a3f8ac20-encryption-config\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 
crc kubenswrapper[4707]: I0129 03:29:46.007297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-serving-cert\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkwpq\" (UniqueName: \"kubernetes.io/projected/69e542f9-e9bf-424e-9d2c-852baf887b17-kube-api-access-tkwpq\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/52c7f103-7152-484c-aff9-da45a3f8ac20-etcd-client\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007427 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/52c7f103-7152-484c-aff9-da45a3f8ac20-audit-dir\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw7jd\" (UniqueName: \"kubernetes.io/projected/52c7f103-7152-484c-aff9-da45a3f8ac20-kube-api-access-jw7jd\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007482 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-etcd-serving-ca\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-client-ca\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007554 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-config\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52c7f103-7152-484c-aff9-da45a3f8ac20-serving-cert\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-client-ca\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-audit\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-config\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" 
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007884 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.007901 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.008287 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.008491 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.008924 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.009435 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.009610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.009661 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.009917 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.010056 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.009917 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.010338 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.010348 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.010646 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.010884 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.010963 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.011134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.011276 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.011332 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.011470 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.013927 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.014165 4707 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.014360 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.014868 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.015385 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h9hdt"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.015761 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h9hdt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.016104 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.016463 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.016623 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.016695 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.016756 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.017303 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.017377 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.017619 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.018206 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.018280 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.018369 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.018605 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.018653 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.020488 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.024431 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cbr69"] Jan 29 03:29:46 crc 
kubenswrapper[4707]: I0129 03:29:46.025200 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.025319 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.025955 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.026131 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.026974 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.028299 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.028412 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.028443 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.028581 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.036726 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.036939 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.037052 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-snpzw"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.045200 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-snpzw" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.047448 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z575b"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.048020 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rr7xr"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.048348 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.048648 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.049857 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.053379 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.053820 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7h7rj"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.054336 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.054648 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.054893 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.056129 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.056318 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.056395 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.056442 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.056653 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.056714 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.056782 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.056893 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.058846 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.058983 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.059060 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.059143 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.059264 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.059337 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.059401 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.059472 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.059569 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.060490 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.060694 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.060783 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.060864 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.060982 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.061066 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.061192 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.061276 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.061380 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.064344 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.056893 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.064632 4707 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.089647 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.089660 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.064287 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.091733 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.094068 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.095837 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.096958 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.097864 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.099031 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.113230 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.113413 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.129330 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.129812 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.129957 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130138 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130158 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.129986 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130315 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130488 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 29 03:29:46 crc 
kubenswrapper[4707]: I0129 03:29:46.130550 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130319 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130579 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130490 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130818 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.130783 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.131220 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.131240 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.131405 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.132996 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.133258 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.133379 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.134289 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.134901 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135007 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-f7lpb"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135163 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135388 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135659 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-config\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135692 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52c7f103-7152-484c-aff9-da45a3f8ac20-serving-cert\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135722 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-client-ca\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-audit\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135809 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqvj8\" (UniqueName: \"kubernetes.io/projected/90329817-eb42-4c0b-8e57-908c60c1db50-kube-api-access-dqvj8\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135830 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-config\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135854 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" Jan 
29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e542f9-e9bf-424e-9d2c-852baf887b17-serving-cert\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf7f47e6-91e0-4084-b586-69208aba0921-trusted-ca\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.135914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-image-import-ca\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.136710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-audit\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.136833 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.137103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-config\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.138273 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.138473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-client-ca\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.139981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-config\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.140145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-image-import-ca\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfw2r\" (UniqueName: \"kubernetes.io/projected/cf7f47e6-91e0-4084-b586-69208aba0921-kube-api-access-vfw2r\") pod \"console-operator-58897d9998-h9hdt\" (UID: 
\"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/52c7f103-7152-484c-aff9-da45a3f8ac20-node-pullsecrets\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143779 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwzx4\" (UniqueName: \"kubernetes.io/projected/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-kube-api-access-kwzx4\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143819 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-config\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf7f47e6-91e0-4084-b586-69208aba0921-serving-cert\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/52c7f103-7152-484c-aff9-da45a3f8ac20-encryption-config\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143918 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhsnv\" (UniqueName: \"kubernetes.io/projected/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-kube-api-access-dhsnv\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-serving-cert\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.143970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e542f9-e9bf-424e-9d2c-852baf887b17-serving-cert\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: 
I0129 03:29:46.143987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7f47e6-91e0-4084-b586-69208aba0921-config\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144044 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90329817-eb42-4c0b-8e57-908c60c1db50-config\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52c7f103-7152-484c-aff9-da45a3f8ac20-serving-cert\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/52c7f103-7152-484c-aff9-da45a3f8ac20-node-pullsecrets\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 
03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144190 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkwpq\" (UniqueName: \"kubernetes.io/projected/69e542f9-e9bf-424e-9d2c-852baf887b17-kube-api-access-tkwpq\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90329817-eb42-4c0b-8e57-908c60c1db50-serving-cert\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144233 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/52c7f103-7152-484c-aff9-da45a3f8ac20-etcd-client\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/52c7f103-7152-484c-aff9-da45a3f8ac20-audit-dir\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90329817-eb42-4c0b-8e57-908c60c1db50-service-ca-bundle\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: 
\"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw7jd\" (UniqueName: \"kubernetes.io/projected/52c7f103-7152-484c-aff9-da45a3f8ac20-kube-api-access-jw7jd\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-etcd-serving-ca\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144361 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90329817-eb42-4c0b-8e57-908c60c1db50-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.144396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-client-ca\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.145249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.145381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-client-ca\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.145413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/52c7f103-7152-484c-aff9-da45a3f8ac20-audit-dir\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.145845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-config\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.146305 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8bkh7"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.146626 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.146937 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.146962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.147389 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.147392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/52c7f103-7152-484c-aff9-da45a3f8ac20-etcd-serving-ca\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.147464 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.147504 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.148452 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.148883 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.148908 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.149014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-serving-cert\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.149167 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.149295 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.149650 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.150035 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.150100 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.150416 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf"] Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.151130 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.151205 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.151237 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.151135 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.151891 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.152229 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.152662 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.152832 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tpsmp"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.156059 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.159633 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.162263 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7p8hx"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.162499 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.162497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/52c7f103-7152-484c-aff9-da45a3f8ac20-encryption-config\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.162515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/52c7f103-7152-484c-aff9-da45a3f8ac20-etcd-client\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.162916 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k6sdj"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.163034 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.163179 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.165468 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.167927 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mtt54"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.168429 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.171470 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-h88jj"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.172963 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sfkbk"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.179365 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.180232 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.180476 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sfkbk"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.180632 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hp957"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.183913 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.189087 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.189168 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.193223 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z575b"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.196869 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rr7xr"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.198293 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.198580 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h9hdt"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.199293 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-snpzw"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.200302 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.201373 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cbr69"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.202501 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.203673 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.204805 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.205898 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kkzfz"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.206991 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.208060 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4lzss"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.209072 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4lzss"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.209264 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7vfwn"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.210746 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7vfwn"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.212004 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.213572 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.216348 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.217989 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.218322 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.219618 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.220698 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.222188 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.223189 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.225746 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7h7rj"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.226781 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sfkbk"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.227817 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.228825 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.229923 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8bkh7"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.230993 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.232019 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4lzss"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.233029 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7p8hx"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.234054 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7vfwn"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.235060 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.236511 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tpsmp"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.237652 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k6sdj"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.237887 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.238665 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-z6nwq"]
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.239444 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-z6nwq"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf7f47e6-91e0-4084-b586-69208aba0921-serving-cert\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhsnv\" (UniqueName: \"kubernetes.io/projected/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-kube-api-access-dhsnv\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7f47e6-91e0-4084-b586-69208aba0921-config\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245373 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90329817-eb42-4c0b-8e57-908c60c1db50-config\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90329817-eb42-4c0b-8e57-908c60c1db50-serving-cert\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90329817-eb42-4c0b-8e57-908c60c1db50-service-ca-bundle\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90329817-eb42-4c0b-8e57-908c60c1db50-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqvj8\" (UniqueName: \"kubernetes.io/projected/90329817-eb42-4c0b-8e57-908c60c1db50-kube-api-access-dqvj8\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.245661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.246456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90329817-eb42-4c0b-8e57-908c60c1db50-service-ca-bundle\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.246977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf7f47e6-91e0-4084-b586-69208aba0921-config\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.247038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.247162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90329817-eb42-4c0b-8e57-908c60c1db50-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.247265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf7f47e6-91e0-4084-b586-69208aba0921-trusted-ca\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.247440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfw2r\" (UniqueName: \"kubernetes.io/projected/cf7f47e6-91e0-4084-b586-69208aba0921-kube-api-access-vfw2r\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.247960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90329817-eb42-4c0b-8e57-908c60c1db50-config\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.248492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90329817-eb42-4c0b-8e57-908c60c1db50-serving-cert\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.248507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf7f47e6-91e0-4084-b586-69208aba0921-trusted-ca\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.250248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf7f47e6-91e0-4084-b586-69208aba0921-serving-cert\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.251114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.259154 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.279301 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.299310 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.319221 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.340355 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.359179 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.398600 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.418562 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.438015 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.467001 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.479024 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.518776 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.539364 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.552449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/375c69a5-d957-4b05-a8b9-b241d63d52a6-etcd-service-ca\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.552578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09878060-fe42-40c3-b8b2-4392225b3669-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwwcm\" (UID: \"09878060-fe42-40c3-b8b2-4392225b3669\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.552622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.552690 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-oauth-serving-cert\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.552762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2911ab30-6b8e-4e2f-8bb2-2d7270d16c33-serving-cert\") pod \"openshift-config-operator-7777fb866f-cbr69\" (UID: \"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.552854 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5251706a-66cb-47ad-a87f-047d3d252bd6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.552900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5251706a-66cb-47ad-a87f-047d3d252bd6-encryption-config\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.552926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-bound-sa-token\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs72p\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-kube-api-access-hs72p\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/843bc0fd-4ef0-4f01-b1a1-b1281063a3dc-metrics-tls\") pod \"dns-operator-744455d44c-7h7rj\" (UID: \"843bc0fd-4ef0-4f01-b1a1-b1281063a3dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-trusted-ca-bundle\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-tls\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553258 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78wp\" (UniqueName: \"kubernetes.io/projected/c193fefb-9fdb-479f-851b-4f1fd4c9d087-kube-api-access-m78wp\") pod \"downloads-7954f5f757-snpzw\" (UID: \"c193fefb-9fdb-479f-851b-4f1fd4c9d087\") " pod="openshift-console/downloads-7954f5f757-snpzw"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwsdj\" (UniqueName: \"kubernetes.io/projected/3bd3ec93-fe2c-4c05-a521-75bd0886f729-kube-api-access-bwsdj\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553351 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhqvc\" (UniqueName: \"kubernetes.io/projected/2798ce2d-9125-464c-8001-03c3e9f65af7-kube-api-access-nhqvc\") pod \"openshift-apiserver-operator-796bbdcf4f-6t9xh\" (UID: \"2798ce2d-9125-464c-8001-03c3e9f65af7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-config\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3bd3ec93-fe2c-4c05-a521-75bd0886f729-machine-approver-tls\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrmlw\" (UniqueName: \"kubernetes.io/projected/5251706a-66cb-47ad-a87f-047d3d252bd6-kube-api-access-vrmlw\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553486 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmqb7\" (UniqueName: \"kubernetes.io/projected/23df2202-fce8-4515-b147-1256fe6d953b-kube-api-access-vmqb7\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23df2202-fce8-4515-b147-1256fe6d953b-config\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2798ce2d-9125-464c-8001-03c3e9f65af7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6t9xh\" (UID: \"2798ce2d-9125-464c-8001-03c3e9f65af7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553793 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5251706a-66cb-47ad-a87f-047d3d252bd6-etcd-client\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3bd3ec93-fe2c-4c05-a521-75bd0886f729-auth-proxy-config\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/375c69a5-d957-4b05-a8b9-b241d63d52a6-etcd-ca\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2798ce2d-9125-464c-8001-03c3e9f65af7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6t9xh\" (UID: \"2798ce2d-9125-464c-8001-03c3e9f65af7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsthm\" (UniqueName: \"kubernetes.io/projected/2911ab30-6b8e-4e2f-8bb2-2d7270d16c33-kube-api-access-xsthm\") pod \"openshift-config-operator-7777fb866f-cbr69\" (UID: \"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375c69a5-d957-4b05-a8b9-b241d63d52a6-config\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.553994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5251706a-66cb-47ad-a87f-047d3d252bd6-serving-cert\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-oauth-config\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957"
Jan 29 03:29:46 crc kubenswrapper[4707]: E0129 03:29:46.554079 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.054059937 +0000 UTC m=+140.538288852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-certificates\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5251706a-66cb-47ad-a87f-047d3d252bd6-audit-dir\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID:
\"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6bpc\" (UniqueName: \"kubernetes.io/projected/09878060-fe42-40c3-b8b2-4392225b3669-kube-api-access-d6bpc\") pod \"cluster-samples-operator-665b6dd947-gwwcm\" (UID: \"09878060-fe42-40c3-b8b2-4392225b3669\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0980608d-c995-4e76-b03b-f486a390ab5b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-khm86\" (UID: \"0980608d-c995-4e76-b03b-f486a390ab5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554397 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5251706a-66cb-47ad-a87f-047d3d252bd6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/23df2202-fce8-4515-b147-1256fe6d953b-images\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554520 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsfwg\" (UniqueName: \"kubernetes.io/projected/53bfd3ca-7447-44cf-af4c-165db1f5e7be-kube-api-access-wsfwg\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554635 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/375c69a5-d957-4b05-a8b9-b241d63d52a6-serving-cert\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554675 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0980608d-c995-4e76-b03b-f486a390ab5b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-khm86\" (UID: \"0980608d-c995-4e76-b03b-f486a390ab5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djg29\" (UniqueName: \"kubernetes.io/projected/843bc0fd-4ef0-4f01-b1a1-b1281063a3dc-kube-api-access-djg29\") pod \"dns-operator-744455d44c-7h7rj\" (UID: \"843bc0fd-4ef0-4f01-b1a1-b1281063a3dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/23df2202-fce8-4515-b147-1256fe6d953b-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-service-ca\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd3ec93-fe2c-4c05-a521-75bd0886f729-config\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.554975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-trusted-ca\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.555016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9z9j\" (UniqueName: \"kubernetes.io/projected/375c69a5-d957-4b05-a8b9-b241d63d52a6-kube-api-access-q9z9j\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.555104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-serving-cert\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.555144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2911ab30-6b8e-4e2f-8bb2-2d7270d16c33-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cbr69\" (UID: \"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.555230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/375c69a5-d957-4b05-a8b9-b241d63d52a6-etcd-client\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.555263 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5fcn\" (UniqueName: \"kubernetes.io/projected/0980608d-c995-4e76-b03b-f486a390ab5b-kube-api-access-c5fcn\") pod \"kube-storage-version-migrator-operator-b67b599dd-khm86\" (UID: \"0980608d-c995-4e76-b03b-f486a390ab5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.555297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5251706a-66cb-47ad-a87f-047d3d252bd6-audit-policies\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: 
\"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.558817 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.579520 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.599321 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.646346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwzx4\" (UniqueName: \"kubernetes.io/projected/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-kube-api-access-kwzx4\") pod \"route-controller-manager-6576b87f9c-mbczn\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.656106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.656340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: E0129 03:29:46.656430 4707 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.156395591 +0000 UTC m=+140.640624536 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.656767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.656859 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.656902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rckmd\" (UniqueName: 
\"kubernetes.io/projected/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-kube-api-access-rckmd\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.656952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-oauth-serving-cert\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.656986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2523fe-5501-417e-9d8b-85e936ed840c-config-volume\") pod \"collect-profiles-29494275-xzvdh\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657018 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c540dd44-ac44-44c6-8e9c-6e1d1890444e-default-certificate\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/605d630d-8166-4b8e-8594-80ed781b8b9d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hhlzr\" (UID: \"605d630d-8166-4b8e-8594-80ed781b8b9d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 
03:29:46.657126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9tjzd\" (UID: \"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657160 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xph46\" (UniqueName: \"kubernetes.io/projected/74f4c17e-e02b-4971-9341-db1810cb5192-kube-api-access-xph46\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-proxy-tls\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657224 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/48412975-a793-44ee-a22c-dc7ad4145451-signing-key\") pod \"service-ca-9c57cc56f-7p8hx\" (UID: \"48412975-a793-44ee-a22c-dc7ad4145451\") " pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqdkk\" (UniqueName: 
\"kubernetes.io/projected/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-kube-api-access-gqdkk\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a45220-2ae6-4b62-80ae-8c82833390a4-config\") pod \"kube-apiserver-operator-766d6c64bb-srpcb\" (UID: \"32a45220-2ae6-4b62-80ae-8c82833390a4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657337 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa8df0a1-21ab-42b5-92fe-a444e09a0416-webhook-cert\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbjpd\" (UniqueName: \"kubernetes.io/projected/f5aa9908-9832-4df9-b0b3-5c4e466fac2f-kube-api-access-rbjpd\") pod \"olm-operator-6b444d44fb-b9hgf\" (UID: \"f5aa9908-9832-4df9-b0b3-5c4e466fac2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxjqw\" (UniqueName: \"kubernetes.io/projected/9194d298-b1b5-4b06-9254-b484dc1a1382-kube-api-access-jxjqw\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-node-bootstrap-token\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-dir\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/48412975-a793-44ee-a22c-dc7ad4145451-signing-cabundle\") pod \"service-ca-9c57cc56f-7p8hx\" (UID: \"48412975-a793-44ee-a22c-dc7ad4145451\") " pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs72p\" (UniqueName: 
\"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-kube-api-access-hs72p\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m78wp\" (UniqueName: \"kubernetes.io/projected/c193fefb-9fdb-479f-851b-4f1fd4c9d087-kube-api-access-m78wp\") pod \"downloads-7954f5f757-snpzw\" (UID: \"c193fefb-9fdb-479f-851b-4f1fd4c9d087\") " pod="openshift-console/downloads-7954f5f757-snpzw" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657840 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfqj8\" (UniqueName: \"kubernetes.io/projected/94e68ae3-2da0-4fd9-a745-840760dd4efe-kube-api-access-zfqj8\") pod \"catalog-operator-68c6474976-57mk4\" (UID: \"94e68ae3-2da0-4fd9-a745-840760dd4efe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657908 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32a45220-2ae6-4b62-80ae-8c82833390a4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-srpcb\" (UID: \"32a45220-2ae6-4b62-80ae-8c82833390a4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.657997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhqvc\" (UniqueName: \"kubernetes.io/projected/2798ce2d-9125-464c-8001-03c3e9f65af7-kube-api-access-nhqvc\") pod \"openshift-apiserver-operator-796bbdcf4f-6t9xh\" (UID: \"2798ce2d-9125-464c-8001-03c3e9f65af7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658156 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b20b69c3-e242-41ec-b726-c39c4338f7ba-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7v2mk\" (UID: \"b20b69c3-e242-41ec-b726-c39c4338f7ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrmlw\" (UniqueName: \"kubernetes.io/projected/5251706a-66cb-47ad-a87f-047d3d252bd6-kube-api-access-vrmlw\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3bd3ec93-fe2c-4c05-a521-75bd0886f729-machine-approver-tls\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658326 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f5aa9908-9832-4df9-b0b3-5c4e466fac2f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-b9hgf\" (UID: \"f5aa9908-9832-4df9-b0b3-5c4e466fac2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-oauth-serving-cert\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-policies\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2798ce2d-9125-464c-8001-03c3e9f65af7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6t9xh\" (UID: \"2798ce2d-9125-464c-8001-03c3e9f65af7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5251706a-66cb-47ad-a87f-047d3d252bd6-etcd-client\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 
03:29:46.658582 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74f4c17e-e02b-4971-9341-db1810cb5192-metrics-tls\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d8e31df-9566-48bf-9488-dfe398d5bb12-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jr7sf\" (UID: \"2d8e31df-9566-48bf-9488-dfe398d5bb12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/daef8de5-6c32-460e-8830-884671338aca-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8bkh7\" (UID: \"daef8de5-6c32-460e-8830-884671338aca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658769 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsthm\" (UniqueName: \"kubernetes.io/projected/2911ab30-6b8e-4e2f-8bb2-2d7270d16c33-kube-api-access-xsthm\") pod \"openshift-config-operator-7777fb866f-cbr69\" (UID: \"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2798ce2d-9125-464c-8001-03c3e9f65af7-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-6t9xh\" (UID: \"2798ce2d-9125-464c-8001-03c3e9f65af7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375c69a5-d957-4b05-a8b9-b241d63d52a6-config\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.658982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-oauth-config\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv924\" (UniqueName: \"kubernetes.io/projected/605d630d-8166-4b8e-8594-80ed781b8b9d-kube-api-access-jv924\") pod \"machine-config-controller-84d6567774-hhlzr\" (UID: \"605d630d-8166-4b8e-8594-80ed781b8b9d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5251706a-66cb-47ad-a87f-047d3d252bd6-audit-dir\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659140 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/5251706a-66cb-47ad-a87f-047d3d252bd6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32a45220-2ae6-4b62-80ae-8c82833390a4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-srpcb\" (UID: \"32a45220-2ae6-4b62-80ae-8c82833390a4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4afbbde5-74e6-4ce1-bcc1-8d517069e49b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f785m\" (UID: \"4afbbde5-74e6-4ce1-bcc1-8d517069e49b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/23df2202-fce8-4515-b147-1256fe6d953b-images\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/375c69a5-d957-4b05-a8b9-b241d63d52a6-serving-cert\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djg29\" (UniqueName: \"kubernetes.io/projected/843bc0fd-4ef0-4f01-b1a1-b1281063a3dc-kube-api-access-djg29\") pod \"dns-operator-744455d44c-7h7rj\" (UID: \"843bc0fd-4ef0-4f01-b1a1-b1281063a3dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.659686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5251706a-66cb-47ad-a87f-047d3d252bd6-audit-dir\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.660588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375c69a5-d957-4b05-a8b9-b241d63d52a6-config\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.660918 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a06628-39ad-42db-8dec-259a64cc9947-config\") pod \"service-ca-operator-777779d784-qwgsz\" (UID: \"f7a06628-39ad-42db-8dec-259a64cc9947\") 
" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.660989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-service-ca\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.661028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd3ec93-fe2c-4c05-a521-75bd0886f729-config\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.661359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw7jd\" (UniqueName: \"kubernetes.io/projected/52c7f103-7152-484c-aff9-da45a3f8ac20-kube-api-access-jw7jd\") pod \"apiserver-76f77b778f-kkzfz\" (UID: \"52c7f103-7152-484c-aff9-da45a3f8ac20\") " pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.661754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fdst\" (UniqueName: \"kubernetes.io/projected/96781d63-5e82-4529-9051-c3b5c9a8175f-kube-api-access-4fdst\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.661818 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4xj6\" (UniqueName: \"kubernetes.io/projected/4afbbde5-74e6-4ce1-bcc1-8d517069e49b-kube-api-access-p4xj6\") pod 
\"package-server-manager-789f6589d5-f785m\" (UID: \"4afbbde5-74e6-4ce1-bcc1-8d517069e49b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.661851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/23df2202-fce8-4515-b147-1256fe6d953b-images\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.661878 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-trusted-ca\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.661917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-serving-cert\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.661951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/94e68ae3-2da0-4fd9-a745-840760dd4efe-srv-cert\") pod \"catalog-operator-68c6474976-57mk4\" (UID: \"94e68ae3-2da0-4fd9-a745-840760dd4efe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.661986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-csi-data-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662018 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20b69c3-e242-41ec-b726-c39c4338f7ba-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7v2mk\" (UID: \"b20b69c3-e242-41ec-b726-c39c4338f7ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2911ab30-6b8e-4e2f-8bb2-2d7270d16c33-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cbr69\" (UID: \"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c540dd44-ac44-44c6-8e9c-6e1d1890444e-metrics-certs\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662160 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-config-volume\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 
03:29:46.662195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkfhf\" (UniqueName: \"kubernetes.io/projected/c540dd44-ac44-44c6-8e9c-6e1d1890444e-kube-api-access-wkfhf\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/375c69a5-d957-4b05-a8b9-b241d63d52a6-etcd-service-ca\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662276 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09878060-fe42-40c3-b8b2-4392225b3669-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwwcm\" (UID: 
\"09878060-fe42-40c3-b8b2-4392225b3669\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2911ab30-6b8e-4e2f-8bb2-2d7270d16c33-serving-cert\") pod \"openshift-config-operator-7777fb866f-cbr69\" (UID: \"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662458 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd3ec93-fe2c-4c05-a521-75bd0886f729-config\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-service-ca\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5251706a-66cb-47ad-a87f-047d3d252bd6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.662719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74f4c17e-e02b-4971-9341-db1810cb5192-bound-sa-token\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2911ab30-6b8e-4e2f-8bb2-2d7270d16c33-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cbr69\" (UID: \"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgxl5\" (UniqueName: \"kubernetes.io/projected/aa8df0a1-21ab-42b5-92fe-a444e09a0416-kube-api-access-cgxl5\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2798ce2d-9125-464c-8001-03c3e9f65af7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6t9xh\" (UID: \"2798ce2d-9125-464c-8001-03c3e9f65af7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663665 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs2n5\" (UniqueName: \"kubernetes.io/projected/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-kube-api-access-rs2n5\") pod \"marketplace-operator-79b997595-tpsmp\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663718 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpbks\" (UniqueName: \"kubernetes.io/projected/be02bcec-7c1e-4c86-904d-41738c0af270-kube-api-access-wpbks\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5251706a-66cb-47ad-a87f-047d3d252bd6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5251706a-66cb-47ad-a87f-047d3d252bd6-encryption-config\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-bound-sa-token\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.663999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d39d4e48-a5a8-44ff-a1b3-9751f59bdffe-cert\") pod \"ingress-canary-4lzss\" (UID: \"d39d4e48-a5a8-44ff-a1b3-9751f59bdffe\") " pod="openshift-ingress-canary/ingress-canary-4lzss" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrlx\" (UniqueName: \"kubernetes.io/projected/48412975-a793-44ee-a22c-dc7ad4145451-kube-api-access-fxrlx\") pod \"service-ca-9c57cc56f-7p8hx\" (UID: \"48412975-a793-44ee-a22c-dc7ad4145451\") " pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/843bc0fd-4ef0-4f01-b1a1-b1281063a3dc-metrics-tls\") pod \"dns-operator-744455d44c-7h7rj\" (UID: \"843bc0fd-4ef0-4f01-b1a1-b1281063a3dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-trusted-ca-bundle\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-registration-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: 
\"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-oauth-config\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/375c69a5-d957-4b05-a8b9-b241d63d52a6-etcd-service-ca\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74f4c17e-e02b-4971-9341-db1810cb5192-trusted-ca\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tpsmp\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-trusted-ca\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-metrics-tls\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.664950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-tls\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665184 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwsdj\" (UniqueName: \"kubernetes.io/projected/3bd3ec93-fe2c-4c05-a521-75bd0886f729-kube-api-access-bwsdj\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-config\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2523fe-5501-417e-9d8b-85e936ed840c-secret-volume\") pod \"collect-profiles-29494275-xzvdh\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5251706a-66cb-47ad-a87f-047d3d252bd6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8275a2b-4124-46d8-b2f1-4a7e8401e369-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bzqlt\" (UID: 
\"e8275a2b-4124-46d8-b2f1-4a7e8401e369\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7a06628-39ad-42db-8dec-259a64cc9947-serving-cert\") pod \"service-ca-operator-777779d784-qwgsz\" (UID: \"f7a06628-39ad-42db-8dec-259a64cc9947\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-trusted-ca-bundle\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlxkj\" (UniqueName: \"kubernetes.io/projected/daef8de5-6c32-460e-8830-884671338aca-kube-api-access-tlxkj\") pod \"multus-admission-controller-857f4d67dd-8bkh7\" (UID: \"daef8de5-6c32-460e-8830-884671338aca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" Jan 29 03:29:46 crc kubenswrapper[4707]: E0129 03:29:46.665806 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.165774092 +0000 UTC m=+140.650003087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665894 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.665955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmqb7\" (UniqueName: \"kubernetes.io/projected/23df2202-fce8-4515-b147-1256fe6d953b-kube-api-access-vmqb7\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-mountpoint-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb-serving-cert\") 
pod \"kube-controller-manager-operator-78b949d7b-9tjzd\" (UID: \"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23df2202-fce8-4515-b147-1256fe6d953b-config\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/94e68ae3-2da0-4fd9-a745-840760dd4efe-profile-collector-cert\") pod \"catalog-operator-68c6474976-57mk4\" (UID: \"94e68ae3-2da0-4fd9-a745-840760dd4efe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bqsv\" (UniqueName: \"kubernetes.io/projected/b20b69c3-e242-41ec-b726-c39c4338f7ba-kube-api-access-9bqsv\") pod \"openshift-controller-manager-operator-756b6f6bc6-7v2mk\" (UID: \"b20b69c3-e242-41ec-b726-c39c4338f7ba\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/605d630d-8166-4b8e-8594-80ed781b8b9d-proxy-tls\") pod \"machine-config-controller-84d6567774-hhlzr\" (UID: \"605d630d-8166-4b8e-8594-80ed781b8b9d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c540dd44-ac44-44c6-8e9c-6e1d1890444e-stats-auth\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8e31df-9566-48bf-9488-dfe398d5bb12-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jr7sf\" (UID: \"2d8e31df-9566-48bf-9488-dfe398d5bb12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3bd3ec93-fe2c-4c05-a521-75bd0886f729-auth-proxy-config\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/375c69a5-d957-4b05-a8b9-b241d63d52a6-etcd-ca\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb-config\") pod \"kube-controller-manager-operator-78b949d7b-9tjzd\" (UID: \"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5hzm\" (UniqueName: \"kubernetes.io/projected/aa6f96d8-64db-4d31-b07f-a933cca2d98f-kube-api-access-n5hzm\") pod \"migrator-59844c95c7-fzdml\" (UID: \"aa6f96d8-64db-4d31-b07f-a933cca2d98f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5251706a-66cb-47ad-a87f-047d3d252bd6-serving-cert\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9ndf\" (UniqueName: \"kubernetes.io/projected/8a2523fe-5501-417e-9d8b-85e936ed840c-kube-api-access-m9ndf\") pod \"collect-profiles-29494275-xzvdh\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-certs\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6bpc\" (UniqueName: \"kubernetes.io/projected/09878060-fe42-40c3-b8b2-4392225b3669-kube-api-access-d6bpc\") pod \"cluster-samples-operator-665b6dd947-gwwcm\" (UID: \"09878060-fe42-40c3-b8b2-4392225b3669\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.666988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-certificates\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tpsmp\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0980608d-c995-4e76-b03b-f486a390ab5b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-khm86\" (UID: \"0980608d-c995-4e76-b03b-f486a390ab5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm9lt\" (UniqueName: \"kubernetes.io/projected/e8275a2b-4124-46d8-b2f1-4a7e8401e369-kube-api-access-rm9lt\") pod \"control-plane-machine-set-operator-78cbb6b69f-bzqlt\" (UID: \"e8275a2b-4124-46d8-b2f1-4a7e8401e369\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0980608d-c995-4e76-b03b-f486a390ab5b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-khm86\" (UID: \"0980608d-c995-4e76-b03b-f486a390ab5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsfwg\" (UniqueName: \"kubernetes.io/projected/53bfd3ca-7447-44cf-af4c-165db1f5e7be-kube-api-access-wsfwg\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aa8df0a1-21ab-42b5-92fe-a444e09a0416-tmpfs\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-plugins-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/23df2202-fce8-4515-b147-1256fe6d953b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c540dd44-ac44-44c6-8e9c-6e1d1890444e-service-ca-bundle\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgld6\" (UniqueName: \"kubernetes.io/projected/f7a06628-39ad-42db-8dec-259a64cc9947-kube-api-access-sgld6\") pod \"service-ca-operator-777779d784-qwgsz\" (UID: \"f7a06628-39ad-42db-8dec-259a64cc9947\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa8df0a1-21ab-42b5-92fe-a444e09a0416-apiservice-cert\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9z9j\" (UniqueName: \"kubernetes.io/projected/375c69a5-d957-4b05-a8b9-b241d63d52a6-kube-api-access-q9z9j\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667822 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr2jk\" (UniqueName: \"kubernetes.io/projected/d39d4e48-a5a8-44ff-a1b3-9751f59bdffe-kube-api-access-gr2jk\") pod \"ingress-canary-4lzss\" (UID: \"d39d4e48-a5a8-44ff-a1b3-9751f59bdffe\") " pod="openshift-ingress-canary/ingress-canary-4lzss"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667870 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8e31df-9566-48bf-9488-dfe398d5bb12-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jr7sf\" (UID: \"2d8e31df-9566-48bf-9488-dfe398d5bb12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/375c69a5-d957-4b05-a8b9-b241d63d52a6-etcd-client\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.667985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5fcn\" (UniqueName: \"kubernetes.io/projected/0980608d-c995-4e76-b03b-f486a390ab5b-kube-api-access-c5fcn\") pod \"kube-storage-version-migrator-operator-b67b599dd-khm86\" (UID: \"0980608d-c995-4e76-b03b-f486a390ab5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.668030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5251706a-66cb-47ad-a87f-047d3d252bd6-audit-policies\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.668108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.668156 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-socket-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.668198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-images\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.668273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f5aa9908-9832-4df9-b0b3-5c4e466fac2f-srv-cert\") pod \"olm-operator-6b444d44fb-b9hgf\" (UID: \"f5aa9908-9832-4df9-b0b3-5c4e466fac2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.668319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.668445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3bd3ec93-fe2c-4c05-a521-75bd0886f729-auth-proxy-config\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.669453 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/375c69a5-d957-4b05-a8b9-b241d63d52a6-etcd-ca\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.671147 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-certificates\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.672411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-config\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.672827 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5251706a-66cb-47ad-a87f-047d3d252bd6-audit-policies\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.674484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0980608d-c995-4e76-b03b-f486a390ab5b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-khm86\" (UID: \"0980608d-c995-4e76-b03b-f486a390ab5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.674792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.677439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/843bc0fd-4ef0-4f01-b1a1-b1281063a3dc-metrics-tls\") pod \"dns-operator-744455d44c-7h7rj\" (UID: \"843bc0fd-4ef0-4f01-b1a1-b1281063a3dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.678896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23df2202-fce8-4515-b147-1256fe6d953b-config\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.680725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09878060-fe42-40c3-b8b2-4392225b3669-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwwcm\" (UID: \"09878060-fe42-40c3-b8b2-4392225b3669\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.680846 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.682260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0980608d-c995-4e76-b03b-f486a390ab5b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-khm86\" (UID: \"0980608d-c995-4e76-b03b-f486a390ab5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.682956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/375c69a5-d957-4b05-a8b9-b241d63d52a6-serving-cert\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.683164 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/23df2202-fce8-4515-b147-1256fe6d953b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.683250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3bd3ec93-fe2c-4c05-a521-75bd0886f729-machine-approver-tls\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.683344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2911ab30-6b8e-4e2f-8bb2-2d7270d16c33-serving-cert\") pod \"openshift-config-operator-7777fb866f-cbr69\" (UID: \"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.683889 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5251706a-66cb-47ad-a87f-047d3d252bd6-etcd-client\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.684484 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-serving-cert\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.686219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2798ce2d-9125-464c-8001-03c3e9f65af7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6t9xh\" (UID: \"2798ce2d-9125-464c-8001-03c3e9f65af7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.687717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5251706a-66cb-47ad-a87f-047d3d252bd6-encryption-config\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.688885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/375c69a5-d957-4b05-a8b9-b241d63d52a6-etcd-client\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.689159 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-tls\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.690074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5251706a-66cb-47ad-a87f-047d3d252bd6-serving-cert\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.690411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkwpq\" (UniqueName: \"kubernetes.io/projected/69e542f9-e9bf-424e-9d2c-852baf887b17-kube-api-access-tkwpq\") pod \"controller-manager-879f6c89f-h88jj\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") " pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.698909 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.719482 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.738917 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.758777 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.769818 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74f4c17e-e02b-4971-9341-db1810cb5192-metrics-tls\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d8e31df-9566-48bf-9488-dfe398d5bb12-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jr7sf\" (UID: \"2d8e31df-9566-48bf-9488-dfe398d5bb12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf"
Jan 29 03:29:46 crc kubenswrapper[4707]: E0129 03:29:46.770228 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.270193628 +0000 UTC m=+140.754422573 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/daef8de5-6c32-460e-8830-884671338aca-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8bkh7\" (UID: \"daef8de5-6c32-460e-8830-884671338aca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv924\" (UniqueName: \"kubernetes.io/projected/605d630d-8166-4b8e-8594-80ed781b8b9d-kube-api-access-jv924\") pod \"machine-config-controller-84d6567774-hhlzr\" (UID: \"605d630d-8166-4b8e-8594-80ed781b8b9d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32a45220-2ae6-4b62-80ae-8c82833390a4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-srpcb\" (UID: \"32a45220-2ae6-4b62-80ae-8c82833390a4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4afbbde5-74e6-4ce1-bcc1-8d517069e49b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f785m\" (UID: \"4afbbde5-74e6-4ce1-bcc1-8d517069e49b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a06628-39ad-42db-8dec-259a64cc9947-config\") pod \"service-ca-operator-777779d784-qwgsz\" (UID: \"f7a06628-39ad-42db-8dec-259a64cc9947\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fdst\" (UniqueName: \"kubernetes.io/projected/96781d63-5e82-4529-9051-c3b5c9a8175f-kube-api-access-4fdst\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4xj6\" (UniqueName: \"kubernetes.io/projected/4afbbde5-74e6-4ce1-bcc1-8d517069e49b-kube-api-access-p4xj6\") pod \"package-server-manager-789f6589d5-f785m\" (UID: \"4afbbde5-74e6-4ce1-bcc1-8d517069e49b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/94e68ae3-2da0-4fd9-a745-840760dd4efe-srv-cert\") pod \"catalog-operator-68c6474976-57mk4\" (UID: \"94e68ae3-2da0-4fd9-a745-840760dd4efe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-csi-data-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20b69c3-e242-41ec-b726-c39c4338f7ba-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7v2mk\" (UID: \"b20b69c3-e242-41ec-b726-c39c4338f7ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.770998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c540dd44-ac44-44c6-8e9c-6e1d1890444e-metrics-certs\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-config-volume\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkfhf\" (UniqueName: \"kubernetes.io/projected/c540dd44-ac44-44c6-8e9c-6e1d1890444e-kube-api-access-wkfhf\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74f4c17e-e02b-4971-9341-db1810cb5192-bound-sa-token\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-csi-data-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgxl5\" (UniqueName: \"kubernetes.io/projected/aa8df0a1-21ab-42b5-92fe-a444e09a0416-kube-api-access-cgxl5\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs2n5\" (UniqueName: \"kubernetes.io/projected/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-kube-api-access-rs2n5\") pod \"marketplace-operator-79b997595-tpsmp\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771283 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpbks\" (UniqueName: \"kubernetes.io/projected/be02bcec-7c1e-4c86-904d-41738c0af270-kube-api-access-wpbks\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d39d4e48-a5a8-44ff-a1b3-9751f59bdffe-cert\") pod \"ingress-canary-4lzss\" (UID: \"d39d4e48-a5a8-44ff-a1b3-9751f59bdffe\") " pod="openshift-ingress-canary/ingress-canary-4lzss"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrlx\" (UniqueName: \"kubernetes.io/projected/48412975-a793-44ee-a22c-dc7ad4145451-kube-api-access-fxrlx\") pod \"service-ca-9c57cc56f-7p8hx\" (UID: \"48412975-a793-44ee-a22c-dc7ad4145451\") " pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-registration-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74f4c17e-e02b-4971-9341-db1810cb5192-trusted-ca\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tpsmp\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-metrics-tls\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2523fe-5501-417e-9d8b-85e936ed840c-secret-volume\") pod \"collect-profiles-29494275-xzvdh\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8275a2b-4124-46d8-b2f1-4a7e8401e369-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bzqlt\" (UID: \"e8275a2b-4124-46d8-b2f1-4a7e8401e369\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7a06628-39ad-42db-8dec-259a64cc9947-serving-cert\") pod \"service-ca-operator-777779d784-qwgsz\" (UID: \"f7a06628-39ad-42db-8dec-259a64cc9947\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.771820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlxkj\" (UniqueName: \"kubernetes.io/projected/daef8de5-6c32-460e-8830-884671338aca-kube-api-access-tlxkj\") pod \"multus-admission-controller-857f4d67dd-8bkh7\" (UID: 
\"daef8de5-6c32-460e-8830-884671338aca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.772313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20b69c3-e242-41ec-b726-c39c4338f7ba-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7v2mk\" (UID: \"b20b69c3-e242-41ec-b726-c39c4338f7ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.772343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-registration-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: E0129 03:29:46.772525 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.272503037 +0000 UTC m=+140.756731992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.772968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-mountpoint-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.773116 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-mountpoint-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.773204 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9tjzd\" (UID: \"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.773984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/94e68ae3-2da0-4fd9-a745-840760dd4efe-profile-collector-cert\") pod \"catalog-operator-68c6474976-57mk4\" 
(UID: \"94e68ae3-2da0-4fd9-a745-840760dd4efe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.774226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775250 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74f4c17e-e02b-4971-9341-db1810cb5192-trusted-ca\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bqsv\" (UniqueName: \"kubernetes.io/projected/b20b69c3-e242-41ec-b726-c39c4338f7ba-kube-api-access-9bqsv\") pod \"openshift-controller-manager-operator-756b6f6bc6-7v2mk\" (UID: \"b20b69c3-e242-41ec-b726-c39c4338f7ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/605d630d-8166-4b8e-8594-80ed781b8b9d-proxy-tls\") pod \"machine-config-controller-84d6567774-hhlzr\" (UID: \"605d630d-8166-4b8e-8594-80ed781b8b9d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775445 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c540dd44-ac44-44c6-8e9c-6e1d1890444e-stats-auth\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8e31df-9566-48bf-9488-dfe398d5bb12-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jr7sf\" (UID: \"2d8e31df-9566-48bf-9488-dfe398d5bb12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb-config\") pod \"kube-controller-manager-operator-78b949d7b-9tjzd\" (UID: \"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5hzm\" (UniqueName: \"kubernetes.io/projected/aa6f96d8-64db-4d31-b07f-a933cca2d98f-kube-api-access-n5hzm\") pod \"migrator-59844c95c7-fzdml\" (UID: \"aa6f96d8-64db-4d31-b07f-a933cca2d98f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9ndf\" (UniqueName: \"kubernetes.io/projected/8a2523fe-5501-417e-9d8b-85e936ed840c-kube-api-access-m9ndf\") pod \"collect-profiles-29494275-xzvdh\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-certs\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tpsmp\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm9lt\" (UniqueName: \"kubernetes.io/projected/e8275a2b-4124-46d8-b2f1-4a7e8401e369-kube-api-access-rm9lt\") pod \"control-plane-machine-set-operator-78cbb6b69f-bzqlt\" (UID: \"e8275a2b-4124-46d8-b2f1-4a7e8401e369\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775913 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aa8df0a1-21ab-42b5-92fe-a444e09a0416-tmpfs\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775948 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-plugins-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.775996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c540dd44-ac44-44c6-8e9c-6e1d1890444e-service-ca-bundle\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgld6\" (UniqueName: \"kubernetes.io/projected/f7a06628-39ad-42db-8dec-259a64cc9947-kube-api-access-sgld6\") pod \"service-ca-operator-777779d784-qwgsz\" (UID: \"f7a06628-39ad-42db-8dec-259a64cc9947\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74f4c17e-e02b-4971-9341-db1810cb5192-metrics-tls\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776211 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa8df0a1-21ab-42b5-92fe-a444e09a0416-apiservice-cert\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776260 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr2jk\" (UniqueName: \"kubernetes.io/projected/d39d4e48-a5a8-44ff-a1b3-9751f59bdffe-kube-api-access-gr2jk\") pod \"ingress-canary-4lzss\" (UID: \"d39d4e48-a5a8-44ff-a1b3-9751f59bdffe\") " pod="openshift-ingress-canary/ingress-canary-4lzss" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8e31df-9566-48bf-9488-dfe398d5bb12-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jr7sf\" (UID: \"2d8e31df-9566-48bf-9488-dfe398d5bb12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-socket-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " 
pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-images\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f5aa9908-9832-4df9-b0b3-5c4e466fac2f-srv-cert\") pod \"olm-operator-6b444d44fb-b9hgf\" (UID: \"f5aa9908-9832-4df9-b0b3-5c4e466fac2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776819 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rckmd\" (UniqueName: \"kubernetes.io/projected/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-kube-api-access-rckmd\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2523fe-5501-417e-9d8b-85e936ed840c-config-volume\") pod \"collect-profiles-29494275-xzvdh\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.776936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c540dd44-ac44-44c6-8e9c-6e1d1890444e-default-certificate\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777031 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/605d630d-8166-4b8e-8594-80ed781b8b9d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hhlzr\" (UID: \"605d630d-8166-4b8e-8594-80ed781b8b9d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9tjzd\" (UID: \"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xph46\" (UniqueName: \"kubernetes.io/projected/74f4c17e-e02b-4971-9341-db1810cb5192-kube-api-access-xph46\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-proxy-tls\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-plugins-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: 
\"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/48412975-a793-44ee-a22c-dc7ad4145451-signing-key\") pod \"service-ca-9c57cc56f-7p8hx\" (UID: \"48412975-a793-44ee-a22c-dc7ad4145451\") " pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqdkk\" (UniqueName: \"kubernetes.io/projected/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-kube-api-access-gqdkk\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a45220-2ae6-4b62-80ae-8c82833390a4-config\") pod \"kube-apiserver-operator-766d6c64bb-srpcb\" (UID: \"32a45220-2ae6-4b62-80ae-8c82833390a4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa8df0a1-21ab-42b5-92fe-a444e09a0416-webhook-cert\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb-config\") 
pod \"kube-controller-manager-operator-78b949d7b-9tjzd\" (UID: \"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbjpd\" (UniqueName: \"kubernetes.io/projected/f5aa9908-9832-4df9-b0b3-5c4e466fac2f-kube-api-access-rbjpd\") pod \"olm-operator-6b444d44fb-b9hgf\" (UID: \"f5aa9908-9832-4df9-b0b3-5c4e466fac2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxjqw\" (UniqueName: \"kubernetes.io/projected/9194d298-b1b5-4b06-9254-b484dc1a1382-kube-api-access-jxjqw\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-node-bootstrap-token\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-dir\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777854 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/48412975-a793-44ee-a22c-dc7ad4145451-signing-cabundle\") pod \"service-ca-9c57cc56f-7p8hx\" (UID: \"48412975-a793-44ee-a22c-dc7ad4145451\") " pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/96781d63-5e82-4529-9051-c3b5c9a8175f-socket-dir\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfqj8\" (UniqueName: \"kubernetes.io/projected/94e68ae3-2da0-4fd9-a745-840760dd4efe-kube-api-access-zfqj8\") pod \"catalog-operator-68c6474976-57mk4\" (UID: \"94e68ae3-2da0-4fd9-a745-840760dd4efe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.778046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32a45220-2ae6-4b62-80ae-8c82833390a4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-srpcb\" (UID: \"32a45220-2ae6-4b62-80ae-8c82833390a4\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.778097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.778161 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b20b69c3-e242-41ec-b726-c39c4338f7ba-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7v2mk\" (UID: \"b20b69c3-e242-41ec-b726-c39c4338f7ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.778246 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f5aa9908-9832-4df9-b0b3-5c4e466fac2f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-b9hgf\" (UID: \"f5aa9908-9832-4df9-b0b3-5c4e466fac2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.778298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-policies\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.778848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.779044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c540dd44-ac44-44c6-8e9c-6e1d1890444e-metrics-certs\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.779191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-dir\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.779704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c540dd44-ac44-44c6-8e9c-6e1d1890444e-service-ca-bundle\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.777189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aa8df0a1-21ab-42b5-92fe-a444e09a0416-tmpfs\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.779928 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/32a45220-2ae6-4b62-80ae-8c82833390a4-config\") pod \"kube-apiserver-operator-766d6c64bb-srpcb\" (UID: \"32a45220-2ae6-4b62-80ae-8c82833390a4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.780029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/daef8de5-6c32-460e-8830-884671338aca-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8bkh7\" (UID: \"daef8de5-6c32-460e-8830-884671338aca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.780425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32a45220-2ae6-4b62-80ae-8c82833390a4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-srpcb\" (UID: \"32a45220-2ae6-4b62-80ae-8c82833390a4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.780866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/605d630d-8166-4b8e-8594-80ed781b8b9d-proxy-tls\") pod \"machine-config-controller-84d6567774-hhlzr\" (UID: \"605d630d-8166-4b8e-8594-80ed781b8b9d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.781342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/605d630d-8166-4b8e-8594-80ed781b8b9d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hhlzr\" (UID: \"605d630d-8166-4b8e-8594-80ed781b8b9d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" Jan 29 03:29:46 crc kubenswrapper[4707]: 
I0129 03:29:46.783351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c540dd44-ac44-44c6-8e9c-6e1d1890444e-stats-auth\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.783403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c540dd44-ac44-44c6-8e9c-6e1d1890444e-default-certificate\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.783459 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.783759 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9tjzd\" (UID: \"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.785126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b20b69c3-e242-41ec-b726-c39c4338f7ba-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7v2mk\" (UID: \"b20b69c3-e242-41ec-b726-c39c4338f7ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.800030 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.806431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4afbbde5-74e6-4ce1-bcc1-8d517069e49b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f785m\" (UID: \"4afbbde5-74e6-4ce1-bcc1-8d517069e49b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.819064 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.838921 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.859179 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.867171 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.879895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:46 crc kubenswrapper[4707]: E0129 03:29:46.882143 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.382109379 +0000 UTC m=+140.866338284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.882527 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.890172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-images\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc 
kubenswrapper[4707]: I0129 03:29:46.899166 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.909707 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.919351 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.924742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-proxy-tls\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.939726 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.940776 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.947576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2523fe-5501-417e-9d8b-85e936ed840c-secret-volume\") pod \"collect-profiles-29494275-xzvdh\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.947839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/94e68ae3-2da0-4fd9-a745-840760dd4efe-profile-collector-cert\") pod \"catalog-operator-68c6474976-57mk4\" (UID: \"94e68ae3-2da0-4fd9-a745-840760dd4efe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.954046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f5aa9908-9832-4df9-b0b3-5c4e466fac2f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-b9hgf\" (UID: \"f5aa9908-9832-4df9-b0b3-5c4e466fac2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.959671 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.974680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f5aa9908-9832-4df9-b0b3-5c4e466fac2f-srv-cert\") pod \"olm-operator-6b444d44fb-b9hgf\" (UID: 
\"f5aa9908-9832-4df9-b0b3-5c4e466fac2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.978555 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 29 03:29:46 crc kubenswrapper[4707]: I0129 03:29:46.988067 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:46 crc kubenswrapper[4707]: E0129 03:29:46.988888 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.488866105 +0000 UTC m=+140.973095050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.000272 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.020801 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.045970 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.059374 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.060418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8275a2b-4124-46d8-b2f1-4a7e8401e369-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bzqlt\" (UID: \"e8275a2b-4124-46d8-b2f1-4a7e8401e369\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.064322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/94e68ae3-2da0-4fd9-a745-840760dd4efe-srv-cert\") pod 
\"catalog-operator-68c6474976-57mk4\" (UID: \"94e68ae3-2da0-4fd9-a745-840760dd4efe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.078467 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.090101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.090383 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.590337262 +0000 UTC m=+141.074566197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.090780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.091362 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.591348773 +0000 UTC m=+141.075577668 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.099613 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.119767 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.130851 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8e31df-9566-48bf-9488-dfe398d5bb12-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jr7sf\" (UID: \"2d8e31df-9566-48bf-9488-dfe398d5bb12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.145491 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.156671 4707 request.go:700] Waited for 1.00497637s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-kube-scheduler-operator-config&limit=500&resourceVersion=0 Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.158500 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.170128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8e31df-9566-48bf-9488-dfe398d5bb12-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jr7sf\" (UID: \"2d8e31df-9566-48bf-9488-dfe398d5bb12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.178504 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.182831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa8df0a1-21ab-42b5-92fe-a444e09a0416-apiservice-cert\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.185059 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa8df0a1-21ab-42b5-92fe-a444e09a0416-webhook-cert\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.192264 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 
03:29:47.193176 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.69315686 +0000 UTC m=+141.177385785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.198631 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.219036 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.237563 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.242752 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a06628-39ad-42db-8dec-259a64cc9947-config\") pod \"service-ca-operator-777779d784-qwgsz\" (UID: \"f7a06628-39ad-42db-8dec-259a64cc9947\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.257900 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 29 03:29:47 crc 
kubenswrapper[4707]: I0129 03:29:47.266336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7a06628-39ad-42db-8dec-259a64cc9947-serving-cert\") pod \"service-ca-operator-777779d784-qwgsz\" (UID: \"f7a06628-39ad-42db-8dec-259a64cc9947\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.278899 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.294660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.295175 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.795158373 +0000 UTC m=+141.279387289 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.298710 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.318564 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.337825 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.349886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tpsmp\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.368694 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.374239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tpsmp\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.379264 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.397182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.397379 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.897349352 +0000 UTC m=+141.381578257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.397976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.398626 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:47.89859531 +0000 UTC m=+141.382824245 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.399216 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.412234 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2523fe-5501-417e-9d8b-85e936ed840c-config-volume\") pod \"collect-profiles-29494275-xzvdh\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.413893 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kkzfz"]
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.413971 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn"]
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.419226 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-h88jj"]
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.420609 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 29 03:29:47 crc kubenswrapper[4707]: W0129 03:29:47.437081 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e542f9_e9bf_424e_9d2c_852baf887b17.slice/crio-59a4aa88b6ed83ad0478ad70a74434c02d7cfe9669c0d752ec3e5a18385ea3a7 WatchSource:0}: Error finding container 59a4aa88b6ed83ad0478ad70a74434c02d7cfe9669c0d752ec3e5a18385ea3a7: Status 404 returned error can't find the container with id 59a4aa88b6ed83ad0478ad70a74434c02d7cfe9669c0d752ec3e5a18385ea3a7
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.438334 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.449652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" event={"ID":"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6","Type":"ContainerStarted","Data":"21f2b9ce302558eceab8cd0f502113af8af26d3a3d2a7bbbcdae7ebfb5a69718"}
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.451200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" event={"ID":"52c7f103-7152-484c-aff9-da45a3f8ac20","Type":"ContainerStarted","Data":"0e9cd463bb72d003bc01cb8d1d0c24659a06bb7a023c88545807e5397c1f424a"}
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.452485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" event={"ID":"69e542f9-e9bf-424e-9d2c-852baf887b17","Type":"ContainerStarted","Data":"59a4aa88b6ed83ad0478ad70a74434c02d7cfe9669c0d752ec3e5a18385ea3a7"}
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.459119 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.461391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/48412975-a793-44ee-a22c-dc7ad4145451-signing-cabundle\") pod \"service-ca-9c57cc56f-7p8hx\" (UID: \"48412975-a793-44ee-a22c-dc7ad4145451\") " pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.479710 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.491477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/48412975-a793-44ee-a22c-dc7ad4145451-signing-key\") pod \"service-ca-9c57cc56f-7p8hx\" (UID: \"48412975-a793-44ee-a22c-dc7ad4145451\") " pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.498220 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.500901 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.501098 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.001076207 +0000 UTC m=+141.485305112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.501508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.502064 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.002054627 +0000 UTC m=+141.486283532 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.518522 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.538636 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.554038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.558550 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.573733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.578578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.603859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.605928 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.105897375 +0000 UTC m=+141.590126290 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.607798 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.613968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.632890 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.638450 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.642918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.644960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.657919 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.662582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.679314 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.685304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.702578 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.706660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.707459 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.207445165 +0000 UTC m=+141.691674080 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.707529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.718365 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.719616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-policies\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.738565 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.758308 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.771360 4707 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.771491 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-cliconfig podName:9194d298-b1b5-4b06-9254-b484dc1a1382 nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.271464678 +0000 UTC m=+141.755693583 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-k6sdj" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382") : failed to sync configmap cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.771684 4707 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.771807 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-config-volume podName:4d9bdeda-5d83-4e63-bce4-527dd6aea51e nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.271774107 +0000 UTC m=+141.756003212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-config-volume") pod "dns-default-sfkbk" (UID: "4d9bdeda-5d83-4e63-bce4-527dd6aea51e") : failed to sync configmap cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.772431 4707 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.772457 4707 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.772521 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d39d4e48-a5a8-44ff-a1b3-9751f59bdffe-cert podName:d39d4e48-a5a8-44ff-a1b3-9751f59bdffe nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.272506199 +0000 UTC m=+141.756735274 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d39d4e48-a5a8-44ff-a1b3-9751f59bdffe-cert") pod "ingress-canary-4lzss" (UID: "d39d4e48-a5a8-44ff-a1b3-9751f59bdffe") : failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.772630 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-metrics-tls podName:4d9bdeda-5d83-4e63-bce4-527dd6aea51e nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.272616932 +0000 UTC m=+141.756845837 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-metrics-tls") pod "dns-default-sfkbk" (UID: "4d9bdeda-5d83-4e63-bce4-527dd6aea51e") : failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.778708 4707 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.778796 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-certs podName:be02bcec-7c1e-4c86-904d-41738c0af270 nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.278774337 +0000 UTC m=+141.763003252 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-certs") pod "machine-config-server-z6nwq" (UID: "be02bcec-7c1e-4c86-904d-41738c0af270") : failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.778713 4707 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.778850 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-service-ca podName:9194d298-b1b5-4b06-9254-b484dc1a1382 nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.278842299 +0000 UTC m=+141.763071214 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-service-ca") pod "oauth-openshift-558db77b4-k6sdj" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382") : failed to sync configmap cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.779502 4707 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.779524 4707 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.779581 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-node-bootstrap-token podName:be02bcec-7c1e-4c86-904d-41738c0af270 nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.279572951 +0000 UTC m=+141.763801866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-node-bootstrap-token") pod "machine-config-server-z6nwq" (UID: "be02bcec-7c1e-4c86-904d-41738c0af270") : failed to sync secret cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.779613 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-trusted-ca-bundle podName:9194d298-b1b5-4b06-9254-b484dc1a1382 nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.279590022 +0000 UTC m=+141.763818937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-558db77b4-k6sdj" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382") : failed to sync configmap cache: timed out waiting for the condition
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.781199 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.798189 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.808173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.808652 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.308600423 +0000 UTC m=+141.792829368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.809720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.810252 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.310224492 +0000 UTC m=+141.794453477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.824900 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.837868 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.858740 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.878477 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.899606 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.911636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.911826 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.411786102 +0000 UTC m=+141.896015007 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.912479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:47 crc kubenswrapper[4707]: E0129 03:29:47.912931 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.412923086 +0000 UTC m=+141.897151991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.919586 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.940043 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.958868 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.978601 4707 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 29 03:29:47 crc kubenswrapper[4707]: I0129 03:29:47.999515 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.014072 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.014323 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.51428755 +0000 UTC m=+141.998516495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.016427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.017263 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.517238859 +0000 UTC m=+142.001467804 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.020009 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.039149 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.059267 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.079799 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.117773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.118072 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.618042766 +0000 UTC m=+142.102271711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.118837 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.119353 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.619333815 +0000 UTC m=+142.103562760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.130532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhsnv\" (UniqueName: \"kubernetes.io/projected/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-kube-api-access-dhsnv\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.149257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqvj8\" (UniqueName: \"kubernetes.io/projected/90329817-eb42-4c0b-8e57-908c60c1db50-kube-api-access-dqvj8\") pod \"authentication-operator-69f744f599-vt6tg\" (UID: \"90329817-eb42-4c0b-8e57-908c60c1db50\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.157092 4707 request.go:700] Waited for 1.909318919s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token
Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.169581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9f6d2ff-a38b-4d03-be71-90f5325aa1d4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-58jbp\" (UID: \"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4\") "
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.188516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfw2r\" (UniqueName: \"kubernetes.io/projected/cf7f47e6-91e0-4084-b586-69208aba0921-kube-api-access-vfw2r\") pod \"console-operator-58897d9998-h9hdt\" (UID: \"cf7f47e6-91e0-4084-b586-69208aba0921\") " pod="openshift-console-operator/console-operator-58897d9998-h9hdt" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.219885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.220218 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.720185653 +0000 UTC m=+142.204414568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.220768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.221193 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.721176133 +0000 UTC m=+142.205405048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.227422 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs72p\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-kube-api-access-hs72p\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.239280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78wp\" (UniqueName: \"kubernetes.io/projected/c193fefb-9fdb-479f-851b-4f1fd4c9d087-kube-api-access-m78wp\") pod \"downloads-7954f5f757-snpzw\" (UID: \"c193fefb-9fdb-479f-851b-4f1fd4c9d087\") " pod="openshift-console/downloads-7954f5f757-snpzw" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.262801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrmlw\" (UniqueName: \"kubernetes.io/projected/5251706a-66cb-47ad-a87f-047d3d252bd6-kube-api-access-vrmlw\") pod \"apiserver-7bbb656c7d-j2nxf\" (UID: \"5251706a-66cb-47ad-a87f-047d3d252bd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.281020 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.281764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhqvc\" (UniqueName: \"kubernetes.io/projected/2798ce2d-9125-464c-8001-03c3e9f65af7-kube-api-access-nhqvc\") pod \"openshift-apiserver-operator-796bbdcf4f-6t9xh\" (UID: \"2798ce2d-9125-464c-8001-03c3e9f65af7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.290727 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h9hdt" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.308655 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.310150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsthm\" (UniqueName: \"kubernetes.io/projected/2911ab30-6b8e-4e2f-8bb2-2d7270d16c33-kube-api-access-xsthm\") pod \"openshift-config-operator-7777fb866f-cbr69\" (UID: \"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.316896 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-snpzw" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.323657 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.323786 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.823765184 +0000 UTC m=+142.307994089 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.323918 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.323989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.324062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-config-volume\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.324137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d39d4e48-a5a8-44ff-a1b3-9751f59bdffe-cert\") pod \"ingress-canary-4lzss\" (UID: \"d39d4e48-a5a8-44ff-a1b3-9751f59bdffe\") " pod="openshift-ingress-canary/ingress-canary-4lzss" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.324167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-metrics-tls\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.324192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.324264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-certs\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.324343 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.324399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-node-bootstrap-token\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.324656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.325340 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-config-volume\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.326887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.327318 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-metrics-tls\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.327632 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.827599559 +0000 UTC m=+142.311828474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.327898 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.328558 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-certs\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.329005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/be02bcec-7c1e-4c86-904d-41738c0af270-node-bootstrap-token\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.330527 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d39d4e48-a5a8-44ff-a1b3-9751f59bdffe-cert\") pod \"ingress-canary-4lzss\" (UID: \"d39d4e48-a5a8-44ff-a1b3-9751f59bdffe\") " 
pod="openshift-ingress-canary/ingress-canary-4lzss" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.333021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djg29\" (UniqueName: \"kubernetes.io/projected/843bc0fd-4ef0-4f01-b1a1-b1281063a3dc-kube-api-access-djg29\") pod \"dns-operator-744455d44c-7h7rj\" (UID: \"843bc0fd-4ef0-4f01-b1a1-b1281063a3dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.334339 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.339604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-bound-sa-token\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.348342 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.356103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwsdj\" (UniqueName: \"kubernetes.io/projected/3bd3ec93-fe2c-4c05-a521-75bd0886f729-kube-api-access-bwsdj\") pod \"machine-approver-56656f9798-xsm6m\" (UID: \"3bd3ec93-fe2c-4c05-a521-75bd0886f729\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.372253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmqb7\" (UniqueName: \"kubernetes.io/projected/23df2202-fce8-4515-b147-1256fe6d953b-kube-api-access-vmqb7\") pod \"machine-api-operator-5694c8668f-mtt54\" (UID: \"23df2202-fce8-4515-b147-1256fe6d953b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.396953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6bpc\" (UniqueName: \"kubernetes.io/projected/09878060-fe42-40c3-b8b2-4392225b3669-kube-api-access-d6bpc\") pod \"cluster-samples-operator-665b6dd947-gwwcm\" (UID: \"09878060-fe42-40c3-b8b2-4392225b3669\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.423572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9z9j\" (UniqueName: \"kubernetes.io/projected/375c69a5-d957-4b05-a8b9-b241d63d52a6-kube-api-access-q9z9j\") pod \"etcd-operator-b45778765-rr7xr\" (UID: \"375c69a5-d957-4b05-a8b9-b241d63d52a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.428195 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.428724 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:48.928686285 +0000 UTC m=+142.412915190 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.438967 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5fcn\" (UniqueName: \"kubernetes.io/projected/0980608d-c995-4e76-b03b-f486a390ab5b-kube-api-access-c5fcn\") pod \"kube-storage-version-migrator-operator-b67b599dd-khm86\" (UID: \"0980608d-c995-4e76-b03b-f486a390ab5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.439771 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.456174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsfwg\" (UniqueName: \"kubernetes.io/projected/53bfd3ca-7447-44cf-af4c-165db1f5e7be-kube-api-access-wsfwg\") pod \"console-f9d7485db-hp957\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.468883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" event={"ID":"69e542f9-e9bf-424e-9d2c-852baf887b17","Type":"ContainerStarted","Data":"9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443"} Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.469322 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.477615 4707 generic.go:334] "Generic (PLEG): container finished" podID="52c7f103-7152-484c-aff9-da45a3f8ac20" containerID="d95340b811672b7460659ed2fbd179e75d508fe567f89221e2e6eac5d5f0a310" exitCode=0 Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.477688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" event={"ID":"52c7f103-7152-484c-aff9-da45a3f8ac20","Type":"ContainerDied","Data":"d95340b811672b7460659ed2fbd179e75d508fe567f89221e2e6eac5d5f0a310"} Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.477689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d8e31df-9566-48bf-9488-dfe398d5bb12-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jr7sf\" (UID: \"2d8e31df-9566-48bf-9488-dfe398d5bb12\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.479172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" event={"ID":"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6","Type":"ContainerStarted","Data":"a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052"} Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.480095 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.484821 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.492511 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.496957 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.498508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv924\" (UniqueName: \"kubernetes.io/projected/605d630d-8166-4b8e-8594-80ed781b8b9d-kube-api-access-jv924\") pod \"machine-config-controller-84d6567774-hhlzr\" (UID: \"605d630d-8166-4b8e-8594-80ed781b8b9d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.523924 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4xj6\" (UniqueName: \"kubernetes.io/projected/4afbbde5-74e6-4ce1-bcc1-8d517069e49b-kube-api-access-p4xj6\") pod \"package-server-manager-789f6589d5-f785m\" (UID: \"4afbbde5-74e6-4ce1-bcc1-8d517069e49b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.545399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.545831 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.045817142 +0000 UTC m=+142.530046047 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.546250 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.556245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fdst\" (UniqueName: \"kubernetes.io/projected/96781d63-5e82-4529-9051-c3b5c9a8175f-kube-api-access-4fdst\") pod \"csi-hostpathplugin-7vfwn\" (UID: \"96781d63-5e82-4529-9051-c3b5c9a8175f\") " pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.560384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkfhf\" (UniqueName: \"kubernetes.io/projected/c540dd44-ac44-44c6-8e9c-6e1d1890444e-kube-api-access-wkfhf\") pod \"router-default-5444994796-f7lpb\" (UID: \"c540dd44-ac44-44c6-8e9c-6e1d1890444e\") " pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.570362 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.570429 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.576761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpbks\" (UniqueName: \"kubernetes.io/projected/be02bcec-7c1e-4c86-904d-41738c0af270-kube-api-access-wpbks\") pod \"machine-config-server-z6nwq\" (UID: \"be02bcec-7c1e-4c86-904d-41738c0af270\") " pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.578838 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-z6nwq" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.593651 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74f4c17e-e02b-4971-9341-db1810cb5192-bound-sa-token\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:48 crc kubenswrapper[4707]: W0129 03:29:48.595466 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bd3ec93_fe2c_4c05_a521_75bd0886f729.slice/crio-755162f129c9c8cdc9306884939a0c249a5f6da2364b13ed6c30a70089dcf0b3 WatchSource:0}: Error finding container 755162f129c9c8cdc9306884939a0c249a5f6da2364b13ed6c30a70089dcf0b3: Status 404 returned error can't find the container with id 755162f129c9c8cdc9306884939a0c249a5f6da2364b13ed6c30a70089dcf0b3 Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.605849 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.613508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgxl5\" (UniqueName: \"kubernetes.io/projected/aa8df0a1-21ab-42b5-92fe-a444e09a0416-kube-api-access-cgxl5\") pod \"packageserver-d55dfcdfc-dqhzd\" (UID: \"aa8df0a1-21ab-42b5-92fe-a444e09a0416\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.627992 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.642093 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.642156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs2n5\" (UniqueName: \"kubernetes.io/projected/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-kube-api-access-rs2n5\") pod \"marketplace-operator-79b997595-tpsmp\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.647222 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.647942 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.147919249 +0000 UTC m=+142.632148154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.648337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.651473 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.151458255 +0000 UTC m=+142.635687360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.662334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrlx\" (UniqueName: \"kubernetes.io/projected/48412975-a793-44ee-a22c-dc7ad4145451-kube-api-access-fxrlx\") pod \"service-ca-9c57cc56f-7p8hx\" (UID: \"48412975-a793-44ee-a22c-dc7ad4145451\") " pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.690690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlxkj\" (UniqueName: \"kubernetes.io/projected/daef8de5-6c32-460e-8830-884671338aca-kube-api-access-tlxkj\") pod \"multus-admission-controller-857f4d67dd-8bkh7\" (UID: \"daef8de5-6c32-460e-8830-884671338aca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.698884 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.700196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bqsv\" (UniqueName: \"kubernetes.io/projected/b20b69c3-e242-41ec-b726-c39c4338f7ba-kube-api-access-9bqsv\") pod \"openshift-controller-manager-operator-756b6f6bc6-7v2mk\" (UID: \"b20b69c3-e242-41ec-b726-c39c4338f7ba\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.706026 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.722610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm9lt\" (UniqueName: \"kubernetes.io/projected/e8275a2b-4124-46d8-b2f1-4a7e8401e369-kube-api-access-rm9lt\") pod \"control-plane-machine-set-operator-78cbb6b69f-bzqlt\" (UID: \"e8275a2b-4124-46d8-b2f1-4a7e8401e369\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.724031 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.724356 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.742198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr2jk\" (UniqueName: \"kubernetes.io/projected/d39d4e48-a5a8-44ff-a1b3-9751f59bdffe-kube-api-access-gr2jk\") pod \"ingress-canary-4lzss\" (UID: \"d39d4e48-a5a8-44ff-a1b3-9751f59bdffe\") " pod="openshift-ingress-canary/ingress-canary-4lzss" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.754381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.754857 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.254825029 +0000 UTC m=+142.739053934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.754994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.755580 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.755594 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.255565161 +0000 UTC m=+142.739794066 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.758598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5hzm\" (UniqueName: \"kubernetes.io/projected/aa6f96d8-64db-4d31-b07f-a933cca2d98f-kube-api-access-n5hzm\") pod \"migrator-59844c95c7-fzdml\" (UID: \"aa6f96d8-64db-4d31-b07f-a933cca2d98f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.761970 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.775242 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.782625 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.788335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9ndf\" (UniqueName: \"kubernetes.io/projected/8a2523fe-5501-417e-9d8b-85e936ed840c-kube-api-access-m9ndf\") pod \"collect-profiles-29494275-xzvdh\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.796739 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.805411 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.813055 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.815334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbjpd\" (UniqueName: \"kubernetes.io/projected/f5aa9908-9832-4df9-b0b3-5c4e466fac2f-kube-api-access-rbjpd\") pod \"olm-operator-6b444d44fb-b9hgf\" (UID: \"f5aa9908-9832-4df9-b0b3-5c4e466fac2f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.831346 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-snpzw"] Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.832294 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4lzss" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.840073 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"] Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.843172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rckmd\" (UniqueName: \"kubernetes.io/projected/4d9bdeda-5d83-4e63-bce4-527dd6aea51e-kube-api-access-rckmd\") pod \"dns-default-sfkbk\" (UID: \"4d9bdeda-5d83-4e63-bce4-527dd6aea51e\") " pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.850043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfqj8\" (UniqueName: \"kubernetes.io/projected/94e68ae3-2da0-4fd9-a745-840760dd4efe-kube-api-access-zfqj8\") pod \"catalog-operator-68c6474976-57mk4\" (UID: \"94e68ae3-2da0-4fd9-a745-840760dd4efe\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.857160 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.857897 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.357881524 +0000 UTC m=+142.842110419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.865463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxjqw\" (UniqueName: \"kubernetes.io/projected/9194d298-b1b5-4b06-9254-b484dc1a1382-kube-api-access-jxjqw\") pod \"oauth-openshift-558db77b4-k6sdj\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.886271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqdkk\" (UniqueName: \"kubernetes.io/projected/ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007-kube-api-access-gqdkk\") pod \"machine-config-operator-74547568cd-pgx7m\" (UID: \"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.897817 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgld6\" (UniqueName: \"kubernetes.io/projected/f7a06628-39ad-42db-8dec-259a64cc9947-kube-api-access-sgld6\") pod \"service-ca-operator-777779d784-qwgsz\" (UID: \"f7a06628-39ad-42db-8dec-259a64cc9947\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.915486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32a45220-2ae6-4b62-80ae-8c82833390a4-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-srpcb\" (UID: \"32a45220-2ae6-4b62-80ae-8c82833390a4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.927453 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7h7rj"] Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.935839 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vt6tg"] Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.937093 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h9hdt"] Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.937247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xph46\" (UniqueName: \"kubernetes.io/projected/74f4c17e-e02b-4971-9341-db1810cb5192-kube-api-access-xph46\") pod \"ingress-operator-5b745b69d9-s6tr9\" (UID: \"74f4c17e-e02b-4971-9341-db1810cb5192\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.958137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9tjzd\" (UID: \"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.958956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: 
\"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:48 crc kubenswrapper[4707]: E0129 03:29:48.959263 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.459249668 +0000 UTC m=+142.943478573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.971002 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.975165 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.983021 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" Jan 29 03:29:48 crc kubenswrapper[4707]: I0129 03:29:48.989020 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" Jan 29 03:29:49 crc kubenswrapper[4707]: W0129 03:29:49.002126 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod843bc0fd_4ef0_4f01_b1a1_b1281063a3dc.slice/crio-e96894b8311fcfafa3ec5c3db551a7482d574fa31205b869e266baad3b2bcaf7 WatchSource:0}: Error finding container e96894b8311fcfafa3ec5c3db551a7482d574fa31205b869e266baad3b2bcaf7: Status 404 returned error can't find the container with id e96894b8311fcfafa3ec5c3db551a7482d574fa31205b869e266baad3b2bcaf7 Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.030440 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.040344 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.054758 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml" Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.069988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:49 crc kubenswrapper[4707]: E0129 03:29:49.070322 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-29 03:29:49.570307684 +0000 UTC m=+143.054536589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.070341 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.093653 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.111557 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp"] Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.124437 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.125958 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.127896 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mtt54"] Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.140641 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm"] Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.172620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:49 crc kubenswrapper[4707]: E0129 03:29:49.173440 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.6734268 +0000 UTC m=+143.157655705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.274410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:49 crc kubenswrapper[4707]: E0129 03:29:49.274753 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.774734023 +0000 UTC m=+143.258962928 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.275436 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rr7xr"] Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.300252 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cbr69"] Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.306341 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7vfwn"] Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.312498 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh"] Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.381811 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:49 crc kubenswrapper[4707]: E0129 03:29:49.382188 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-29 03:29:49.882175919 +0000 UTC m=+143.366404824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.403144 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86"] Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.463906 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m"] Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.482838 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:49 crc kubenswrapper[4707]: E0129 03:29:49.487652 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:49.987629996 +0000 UTC m=+143.471858901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.545090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-snpzw" event={"ID":"c193fefb-9fdb-479f-851b-4f1fd4c9d087","Type":"ContainerStarted","Data":"1f7b31a9c6ed44eaede739276695d80e9bba21450c021abe02a97d037e0e547d"} Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.589169 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:49 crc kubenswrapper[4707]: E0129 03:29:49.589506 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:50.089496116 +0000 UTC m=+143.573725021 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.597514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" event={"ID":"52c7f103-7152-484c-aff9-da45a3f8ac20","Type":"ContainerStarted","Data":"454132aa4d8120fc209700c4ea7f3c70017e89a0d71d75977db41fb350c7ec21"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.614112 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" event={"ID":"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4","Type":"ContainerStarted","Data":"d082b57737235f249bf41aa0d1466a30e48a151b08cd603dacb1e038bd09dac1"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.639577 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tpsmp"]
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.678897 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8bkh7"]
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.690375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:49 crc kubenswrapper[4707]: E0129 03:29:49.690734 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:50.190713885 +0000 UTC m=+143.674942790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.716616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" event={"ID":"3bd3ec93-fe2c-4c05-a521-75bd0886f729","Type":"ContainerStarted","Data":"a6242790694a87308b27435ed8ce3e80771b280e556bf37a3b3ce0807660c57e"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.716682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" event={"ID":"3bd3ec93-fe2c-4c05-a521-75bd0886f729","Type":"ContainerStarted","Data":"755162f129c9c8cdc9306884939a0c249a5f6da2364b13ed6c30a70089dcf0b3"}
Jan 29 03:29:49 crc kubenswrapper[4707]: W0129 03:29:49.735650 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4afbbde5_74e6_4ce1_bcc1_8d517069e49b.slice/crio-d7cba7472474e47aceffd759cb00aa0853f66212bbd206f38d38b3d00d090300 WatchSource:0}: Error finding container d7cba7472474e47aceffd759cb00aa0853f66212bbd206f38d38b3d00d090300: Status 404 returned error can't find the container with id d7cba7472474e47aceffd759cb00aa0853f66212bbd206f38d38b3d00d090300
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.746896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h9hdt" event={"ID":"cf7f47e6-91e0-4084-b586-69208aba0921","Type":"ContainerStarted","Data":"6ed334c5090cdd70fe6c321e2640ddc642c9e689b1ae493721da1560105105a2"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.754692 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" event={"ID":"23df2202-fce8-4515-b147-1256fe6d953b","Type":"ContainerStarted","Data":"abf41e8f883f0d83f36ce0bb44e08fd054778db02a7387dedffa7223189eb66c"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.764063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" event={"ID":"90329817-eb42-4c0b-8e57-908c60c1db50","Type":"ContainerStarted","Data":"e50317dc7a8c95ec348b4d68ee0f55cf6dc4ddff81080c63a130b8db36d1dc15"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.783878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-z6nwq" event={"ID":"be02bcec-7c1e-4c86-904d-41738c0af270","Type":"ContainerStarted","Data":"a2bd4f7bba958f1c1e6b92a354fc81ed035ebecb89f6a39c615835a2c89be7a9"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.783949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-z6nwq" event={"ID":"be02bcec-7c1e-4c86-904d-41738c0af270","Type":"ContainerStarted","Data":"b96f9f0dc2f89975f697af963037549704f999188b8f74b8c9d359c9a5a39a3b"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.823464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.828240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" event={"ID":"5251706a-66cb-47ad-a87f-047d3d252bd6","Type":"ContainerStarted","Data":"44a457f8c7442c1fe8c70aed4bdb87da5130d0d8b268aed28891473d517ab373"}
Jan 29 03:29:49 crc kubenswrapper[4707]: E0129 03:29:49.830064 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:50.33004937 +0000 UTC m=+143.814278275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.838283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" event={"ID":"843bc0fd-4ef0-4f01-b1a1-b1281063a3dc","Type":"ContainerStarted","Data":"e96894b8311fcfafa3ec5c3db551a7482d574fa31205b869e266baad3b2bcaf7"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.844043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f7lpb" event={"ID":"c540dd44-ac44-44c6-8e9c-6e1d1890444e","Type":"ContainerStarted","Data":"ea7e9ebeb09ca7e3bf8e39aa80ecf64b6e4d1ff91947b317ed2a0df1137da907"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.886758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" event={"ID":"375c69a5-d957-4b05-a8b9-b241d63d52a6","Type":"ContainerStarted","Data":"0891c25a438e3fead5893cacf89452d5a3e9f6df9c817003cd061ba3afddee74"}
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.895877 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr"]
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.914287 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf"]
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.931823 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:49 crc kubenswrapper[4707]: E0129 03:29:49.933708 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:50.433686452 +0000 UTC m=+143.917915357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.950385 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt"]
Jan 29 03:29:49 crc kubenswrapper[4707]: I0129 03:29:49.968993 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk"]
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.034603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.035712 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:50.535699685 +0000 UTC m=+144.019928590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.089594 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" podStartSLOduration=118.089524051 podStartE2EDuration="1m58.089524051s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:50.080610503 +0000 UTC m=+143.564839408" watchObservedRunningTime="2026-01-29 03:29:50.089524051 +0000 UTC m=+143.573752956"
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.136202 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.136710 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:50.636691098 +0000 UTC m=+144.120920003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.242032 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.244194 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:50.744179096 +0000 UTC m=+144.228408001 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: W0129 03:29:50.326402 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod605d630d_8166_4b8e_8594_80ed781b8b9d.slice/crio-01304a60b9aeca7b821f04efb8f73491ef7418a895968b73206f8776e11f92cb WatchSource:0}: Error finding container 01304a60b9aeca7b821f04efb8f73491ef7418a895968b73206f8776e11f92cb: Status 404 returned error can't find the container with id 01304a60b9aeca7b821f04efb8f73491ef7418a895968b73206f8776e11f92cb
Jan 29 03:29:50 crc kubenswrapper[4707]: W0129 03:29:50.339265 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8275a2b_4124_46d8_b2f1_4a7e8401e369.slice/crio-aea4671ff684fcfd123933880f745ff518611a4b8e5fb62b867f89fa294e6175 WatchSource:0}: Error finding container aea4671ff684fcfd123933880f745ff518611a4b8e5fb62b867f89fa294e6175: Status 404 returned error can't find the container with id aea4671ff684fcfd123933880f745ff518611a4b8e5fb62b867f89fa294e6175
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.343786 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.344184 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:50.844164278 +0000 UTC m=+144.328393183 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: W0129 03:29:50.346263 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8e31df_9566_48bf_9488_dfe398d5bb12.slice/crio-a2c951343835cc4c6571a4359ce40e16ec6a849f0957e2628be77741f0cba9a5 WatchSource:0}: Error finding container a2c951343835cc4c6571a4359ce40e16ec6a849f0957e2628be77741f0cba9a5: Status 404 returned error can't find the container with id a2c951343835cc4c6571a4359ce40e16ec6a849f0957e2628be77741f0cba9a5
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.349795 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd"]
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.403200 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hp957"]
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.445346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.446038 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:50.946024357 +0000 UTC m=+144.430253262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.447273 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7p8hx"]
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.547585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.547759 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.047731552 +0000 UTC m=+144.531960457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.547877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.548232 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.048219736 +0000 UTC m=+144.532448641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.554654 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" podStartSLOduration=118.554638449 podStartE2EDuration="1m58.554638449s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:50.552718421 +0000 UTC m=+144.036947326" watchObservedRunningTime="2026-01-29 03:29:50.554638449 +0000 UTC m=+144.038867354"
Jan 29 03:29:50 crc kubenswrapper[4707]: W0129 03:29:50.556134 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa8df0a1_21ab_42b5_92fe_a444e09a0416.slice/crio-045922d4c45461e09c77ba4a024b5a408ee082cb0d15284ae45fdc21b80fcd4d WatchSource:0}: Error finding container 045922d4c45461e09c77ba4a024b5a408ee082cb0d15284ae45fdc21b80fcd4d: Status 404 returned error can't find the container with id 045922d4c45461e09c77ba4a024b5a408ee082cb0d15284ae45fdc21b80fcd4d
Jan 29 03:29:50 crc kubenswrapper[4707]: W0129 03:29:50.559165 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53bfd3ca_7447_44cf_af4c_165db1f5e7be.slice/crio-1c662ee3a3d13614db59cd89e1ff86b9243dbc4c17240534528ff6d9614b3cd8 WatchSource:0}: Error finding container 1c662ee3a3d13614db59cd89e1ff86b9243dbc4c17240534528ff6d9614b3cd8: Status 404 returned error can't find the container with id 1c662ee3a3d13614db59cd89e1ff86b9243dbc4c17240534528ff6d9614b3cd8
Jan 29 03:29:50 crc kubenswrapper[4707]: W0129 03:29:50.600111 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48412975_a793_44ee_a22c_dc7ad4145451.slice/crio-8409aeeed5bf24621ebe5c7794a4bc476db5a3dd9bdbf430794060b2380999cf WatchSource:0}: Error finding container 8409aeeed5bf24621ebe5c7794a4bc476db5a3dd9bdbf430794060b2380999cf: Status 404 returned error can't find the container with id 8409aeeed5bf24621ebe5c7794a4bc476db5a3dd9bdbf430794060b2380999cf
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.657904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.658447 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.158430156 +0000 UTC m=+144.642659061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.701529 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-f7lpb"
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.760412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.761010 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.260988336 +0000 UTC m=+144.745217241 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.783071 4707 patch_prober.go:28] interesting pod/router-default-5444994796-f7lpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 29 03:29:50 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld
Jan 29 03:29:50 crc kubenswrapper[4707]: [+]process-running ok
Jan 29 03:29:50 crc kubenswrapper[4707]: healthz check failed
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.783117 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f7lpb" podUID="c540dd44-ac44-44c6-8e9c-6e1d1890444e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.862288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.862807 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.362779203 +0000 UTC m=+144.847008118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.863245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.863679 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.3636668 +0000 UTC m=+144.847895705 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.921591 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-f7lpb" podStartSLOduration=118.921569919 podStartE2EDuration="1m58.921569919s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:50.901944869 +0000 UTC m=+144.386173764" watchObservedRunningTime="2026-01-29 03:29:50.921569919 +0000 UTC m=+144.405798814"
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.937298 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m"]
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.941715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" event={"ID":"605d630d-8166-4b8e-8594-80ed781b8b9d","Type":"ContainerStarted","Data":"01304a60b9aeca7b821f04efb8f73491ef7418a895968b73206f8776e11f92cb"}
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.960324 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml"]
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.964186 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:50 crc kubenswrapper[4707]: E0129 03:29:50.964813 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.464791717 +0000 UTC m=+144.949020622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:50 crc kubenswrapper[4707]: I0129 03:29:50.975442 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-z6nwq" podStartSLOduration=4.975405745 podStartE2EDuration="4.975405745s" podCreationTimestamp="2026-01-29 03:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:50.937145006 +0000 UTC m=+144.421373911" watchObservedRunningTime="2026-01-29 03:29:50.975405745 +0000 UTC m=+144.459634650"
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.045478 4707 generic.go:334] "Generic (PLEG): container finished" podID="5251706a-66cb-47ad-a87f-047d3d252bd6" containerID="977be8044f20d18a96d94c8fb456d6a54f1ea0a689dcb869f16d9f5a948c848d" exitCode=0
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.045617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" event={"ID":"5251706a-66cb-47ad-a87f-047d3d252bd6","Type":"ContainerDied","Data":"977be8044f20d18a96d94c8fb456d6a54f1ea0a689dcb869f16d9f5a948c848d"}
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.067028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" event={"ID":"2d8e31df-9566-48bf-9488-dfe398d5bb12","Type":"ContainerStarted","Data":"a2c951343835cc4c6571a4359ce40e16ec6a849f0957e2628be77741f0cba9a5"}
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.067403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.067861 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.567844551 +0000 UTC m=+145.052073456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.092579 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh"]
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.111868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" event={"ID":"e8275a2b-4124-46d8-b2f1-4a7e8401e369","Type":"ContainerStarted","Data":"aea4671ff684fcfd123933880f745ff518611a4b8e5fb62b867f89fa294e6175"}
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.141294 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4lzss"]
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.145252 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd"]
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.145806 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" event={"ID":"eeef9237-bd0b-494d-a3a0-8b6e54baa03e","Type":"ContainerStarted","Data":"da18ab320b74de2f8687ed0e9e014d3db3c41c3a29228cf55d1003b8bff24c3e"}
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.150795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" event={"ID":"aa8df0a1-21ab-42b5-92fe-a444e09a0416","Type":"ContainerStarted","Data":"045922d4c45461e09c77ba4a024b5a408ee082cb0d15284ae45fdc21b80fcd4d"}
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.152308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" event={"ID":"2798ce2d-9125-464c-8001-03c3e9f65af7","Type":"ContainerStarted","Data":"fef3c052eb3856ab8e7a102c9d87b602c78ec57ecd4fa0ccecf2419eabed8c95"}
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.154439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86" event={"ID":"0980608d-c995-4e76-b03b-f486a390ab5b","Type":"ContainerStarted","Data":"9a93310b117ff2b1961039406c9b806db0f294ae6edf4b7c154b97254a197fee"}
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.168610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.170724 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.67070801 +0000 UTC m=+145.154936915 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.175943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f7lpb" event={"ID":"c540dd44-ac44-44c6-8e9c-6e1d1890444e","Type":"ContainerStarted","Data":"33396d4f5fcd8cf2d8cb73210fbbed3da953fa958c2a39d2a3efe515aa2f2e05"}
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.183796 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" podStartSLOduration=119.183779373 podStartE2EDuration="1m59.183779373s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:51.182454003 +0000 UTC m=+144.666682908" watchObservedRunningTime="2026-01-29 03:29:51.183779373 +0000 UTC m=+144.668008278"
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.223458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb"]
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.249471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm" event={"ID":"09878060-fe42-40c3-b8b2-4392225b3669","Type":"ContainerStarted","Data":"1a6d4ea7b99dbfbf3606daf0379cb05d65fea1515f23c4af25baf13cdef07709"}
Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.271165
4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-snpzw" Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.271191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-snpzw" event={"ID":"c193fefb-9fdb-479f-851b-4f1fd4c9d087","Type":"ContainerStarted","Data":"f8e2acfcc6daf50a1b03c87e34987086ea77affd366a244f4096fd3fec8dd71f"} Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.271211 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9"] Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.271394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.295924 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-snpzw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.295981 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-snpzw" podUID="c193fefb-9fdb-479f-851b-4f1fd4c9d087" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.299453 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf"] Jan 29 03:29:51 crc 
kubenswrapper[4707]: E0129 03:29:51.299476 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.799460277 +0000 UTC m=+145.283689182 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:51 crc kubenswrapper[4707]: W0129 03:29:51.302443 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd39d4e48_a5a8_44ff_a1b3_9751f59bdffe.slice/crio-d3ed6a77725c6bb9e36bf8a752f23893a8149a4a90e9ad8555ad27c53ec00516 WatchSource:0}: Error finding container d3ed6a77725c6bb9e36bf8a752f23893a8149a4a90e9ad8555ad27c53ec00516: Status 404 returned error can't find the container with id d3ed6a77725c6bb9e36bf8a752f23893a8149a4a90e9ad8555ad27c53ec00516 Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.314914 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-snpzw" podStartSLOduration=119.314892951 podStartE2EDuration="1m59.314892951s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:51.309647943 +0000 UTC m=+144.793876848" watchObservedRunningTime="2026-01-29 03:29:51.314892951 +0000 UTC m=+144.799121856" Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.332431 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" event={"ID":"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33","Type":"ContainerStarted","Data":"8231bcee6cfea34b6cb76ad744597b34d61d02e1ac5003a946df6c98071e8994"} Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.338411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" event={"ID":"4afbbde5-74e6-4ce1-bcc1-8d517069e49b","Type":"ContainerStarted","Data":"d7cba7472474e47aceffd759cb00aa0853f66212bbd206f38d38b3d00d090300"} Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.339400 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" event={"ID":"48412975-a793-44ee-a22c-dc7ad4145451","Type":"ContainerStarted","Data":"8409aeeed5bf24621ebe5c7794a4bc476db5a3dd9bdbf430794060b2380999cf"} Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.361248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" event={"ID":"96781d63-5e82-4529-9051-c3b5c9a8175f","Type":"ContainerStarted","Data":"d334d909295633be5a885d1ab5c9308a923983df1c9cc8376f3f97800cf0c5c9"} Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.369481 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4"] Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.372463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.374449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" event={"ID":"b20b69c3-e242-41ec-b726-c39c4338f7ba","Type":"ContainerStarted","Data":"619241b778d122c9c50dcc885005fc5551cbd220822cdadedc48a02958b28581"} Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.374579 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.874527921 +0000 UTC m=+145.358756826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.381182 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" event={"ID":"daef8de5-6c32-460e-8830-884671338aca","Type":"ContainerStarted","Data":"01b9e017b9665c15b027329703cee470b13a3cca2af3f65607e2544a4110d6db"} Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.387643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hp957" event={"ID":"53bfd3ca-7447-44cf-af4c-165db1f5e7be","Type":"ContainerStarted","Data":"1c662ee3a3d13614db59cd89e1ff86b9243dbc4c17240534528ff6d9614b3cd8"} Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.400512 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sfkbk"] Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.401311 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" podStartSLOduration=119.401290345 podStartE2EDuration="1m59.401290345s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:51.3838025 +0000 UTC m=+144.868031405" watchObservedRunningTime="2026-01-29 03:29:51.401290345 +0000 UTC m=+144.885519250" Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.413380 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz"] Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.416390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k6sdj"] Jan 29 03:29:51 crc kubenswrapper[4707]: W0129 03:29:51.442995 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9194d298_b1b5_4b06_9254_b484dc1a1382.slice/crio-c5234fe324856261d88262adb19a7dc37223978ecf3275ecd3306a8e6085f70a WatchSource:0}: Error finding container c5234fe324856261d88262adb19a7dc37223978ecf3275ecd3306a8e6085f70a: Status 404 returned error can't find the container with id c5234fe324856261d88262adb19a7dc37223978ecf3275ecd3306a8e6085f70a Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.476699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.477713 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:51.97769668 +0000 UTC m=+145.461925585 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.578425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.579320 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.079299251 +0000 UTC m=+145.563528156 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.680991 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.681309 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.181295434 +0000 UTC m=+145.665524329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.707728 4707 patch_prober.go:28] interesting pod/router-default-5444994796-f7lpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 03:29:51 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 29 03:29:51 crc kubenswrapper[4707]: [+]process-running ok Jan 29 03:29:51 crc kubenswrapper[4707]: healthz check failed Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.707785 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f7lpb" podUID="c540dd44-ac44-44c6-8e9c-6e1d1890444e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.782757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.783167 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-29 03:29:52.283153403 +0000 UTC m=+145.767382308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.885103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.885565 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.385529747 +0000 UTC m=+145.869758652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.940167 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.940242 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.986149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.986365 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.486337135 +0000 UTC m=+145.970566050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:51 crc kubenswrapper[4707]: I0129 03:29:51.986531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:51 crc kubenswrapper[4707]: E0129 03:29:51.986908 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.486893502 +0000 UTC m=+145.971122417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.019971 4707 patch_prober.go:28] interesting pod/apiserver-76f77b778f-kkzfz container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]log ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]etcd ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]poststarthook/generic-apiserver-start-informers ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]poststarthook/max-in-flight-filter ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 29 03:29:52 crc kubenswrapper[4707]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 29 03:29:52 crc kubenswrapper[4707]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 29 03:29:52 crc kubenswrapper[4707]: [+]poststarthook/project.openshift.io-projectcache ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-startinformers ok Jan 29 03:29:52 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 29 03:29:52 crc 
kubenswrapper[4707]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 29 03:29:52 crc kubenswrapper[4707]: livez check failed Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.020040 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" podUID="52c7f103-7152-484c-aff9-da45a3f8ac20" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.087571 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.087978 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.587945356 +0000 UTC m=+146.072174261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.088104 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.088690 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.588670688 +0000 UTC m=+146.072899603 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.189775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.190086 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.690037822 +0000 UTC m=+146.174266777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.190399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.190898 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.690877118 +0000 UTC m=+146.175106033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.291778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.292171 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.792126158 +0000 UTC m=+146.276355103 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.292268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.292781 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.792763567 +0000 UTC m=+146.276992512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.394126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.394911 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.894884564 +0000 UTC m=+146.379113469 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.448106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" event={"ID":"a9f6d2ff-a38b-4d03-be71-90f5325aa1d4","Type":"ContainerStarted","Data":"992c2950c0eec7aee6ad7aa9abd9afaccda211b7c933cd5324958994c3180c72"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.454888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" event={"ID":"f7a06628-39ad-42db-8dec-259a64cc9947","Type":"ContainerStarted","Data":"ebfe413316191a4f02519875507910627e00560629dcac58bc348b3863349734"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.461483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" event={"ID":"e8275a2b-4124-46d8-b2f1-4a7e8401e369","Type":"ContainerStarted","Data":"45260274cbba4aab680bda2ab7004cbee805a73b3b388aee31d7468aa38b8408"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.478478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" event={"ID":"74f4c17e-e02b-4971-9341-db1810cb5192","Type":"ContainerStarted","Data":"b37d93e2facfd5bca6d75b36420d678a8cc21042e71551cd9660deb736d77446"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.481372 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" event={"ID":"3bd3ec93-fe2c-4c05-a521-75bd0886f729","Type":"ContainerStarted","Data":"d3bd9d792d0d041d67fe7a7647ebb401e6ceb9da5e867b1400e9a64b86b158c1"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.485954 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-58jbp" podStartSLOduration=120.485941159 podStartE2EDuration="2m0.485941159s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.469098803 +0000 UTC m=+145.953327708" watchObservedRunningTime="2026-01-29 03:29:52.485941159 +0000 UTC m=+145.970170064" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.489088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h9hdt" event={"ID":"cf7f47e6-91e0-4084-b586-69208aba0921","Type":"ContainerStarted","Data":"6d18b0e7cc729919ff85f5d63f3675a70303daf14a0411bac930af0e136625be"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.490141 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-h9hdt" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.492026 4707 patch_prober.go:28] interesting pod/console-operator-58897d9998-h9hdt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.492060 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h9hdt" podUID="cf7f47e6-91e0-4084-b586-69208aba0921" containerName="console-operator" 
probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.492931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" event={"ID":"375c69a5-d957-4b05-a8b9-b241d63d52a6","Type":"ContainerStarted","Data":"d69f6fdaad404dd13632885dd55effcdf822b3f0edcac4e3181f822a3ededfff"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.496329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.496685 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:52.996670371 +0000 UTC m=+146.480899276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.510804 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bzqlt" podStartSLOduration=120.510787405 podStartE2EDuration="2m0.510787405s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.508352072 +0000 UTC m=+145.992580977" watchObservedRunningTime="2026-01-29 03:29:52.510787405 +0000 UTC m=+145.995016300" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.511147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86" event={"ID":"0980608d-c995-4e76-b03b-f486a390ab5b","Type":"ContainerStarted","Data":"57a1c0463155debca2e2611ee6c51fd607d0b6b79b398082fbd6c76726d72a84"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.524243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" event={"ID":"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007","Type":"ContainerStarted","Data":"c08249a4c72aceb8977799a856ecd42aadfb01bf2db941fad37a8a53a8983cc3"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.557219 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rr7xr" 
podStartSLOduration=120.557195529 podStartE2EDuration="2m0.557195529s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.534304141 +0000 UTC m=+146.018533046" watchObservedRunningTime="2026-01-29 03:29:52.557195529 +0000 UTC m=+146.041424434" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.559693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" event={"ID":"eeef9237-bd0b-494d-a3a0-8b6e54baa03e","Type":"ContainerStarted","Data":"7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.559894 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.567011 4707 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tpsmp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.567065 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" podUID="eeef9237-bd0b-494d-a3a0-8b6e54baa03e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.573700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4lzss" event={"ID":"d39d4e48-a5a8-44ff-a1b3-9751f59bdffe","Type":"ContainerStarted","Data":"ba3f3e874819d1197310a894204449e80bd9ba512da0e51f39c7415f6f6c05e8"} Jan 
29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.573757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4lzss" event={"ID":"d39d4e48-a5a8-44ff-a1b3-9751f59bdffe","Type":"ContainerStarted","Data":"d3ed6a77725c6bb9e36bf8a752f23893a8149a4a90e9ad8555ad27c53ec00516"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.600906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" event={"ID":"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb","Type":"ContainerStarted","Data":"d205a997e229b2c9bfc5eb272ecea12012f90a1bf3fa578603b47e86a5354a06"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.608396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.609634 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-h9hdt" podStartSLOduration=120.609621503 podStartE2EDuration="2m0.609621503s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.56057056 +0000 UTC m=+146.044799465" watchObservedRunningTime="2026-01-29 03:29:52.609621503 +0000 UTC m=+146.093850398" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.621491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" 
event={"ID":"9194d298-b1b5-4b06-9254-b484dc1a1382","Type":"ContainerStarted","Data":"c5234fe324856261d88262adb19a7dc37223978ecf3275ecd3306a8e6085f70a"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.622376 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.626622 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4lzss" podStartSLOduration=6.626595463 podStartE2EDuration="6.626595463s" podCreationTimestamp="2026-01-29 03:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.623292974 +0000 UTC m=+146.107521899" watchObservedRunningTime="2026-01-29 03:29:52.626595463 +0000 UTC m=+146.110824388" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.628425 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" podStartSLOduration=120.628408027 podStartE2EDuration="2m0.628408027s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.583621032 +0000 UTC m=+146.067849947" watchObservedRunningTime="2026-01-29 03:29:52.628408027 +0000 UTC m=+146.112636932" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.628920 4707 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-k6sdj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.39:6443/healthz\": dial tcp 10.217.0.39:6443: connect: connection refused" start-of-body= Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.629026 4707 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.39:6443/healthz\": dial tcp 10.217.0.39:6443: connect: connection refused" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.633769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" event={"ID":"f5aa9908-9832-4df9-b0b3-5c4e466fac2f","Type":"ContainerStarted","Data":"50fa843a6d0817a8d9bd27aabd496277520de6f3250ee4fa72d053c845a55be3"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.633817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" event={"ID":"f5aa9908-9832-4df9-b0b3-5c4e466fac2f","Type":"ContainerStarted","Data":"191511c74dfb69a567663819d155ac680067ca15333afd56aedc4a5f46fa52b2"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.634453 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.635900 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:53.135877862 +0000 UTC m=+146.620106767 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.636002 4707 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-b9hgf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.636120 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" podUID="f5aa9908-9832-4df9-b0b3-5c4e466fac2f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.645905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sfkbk" event={"ID":"4d9bdeda-5d83-4e63-bce4-527dd6aea51e","Type":"ContainerStarted","Data":"615ec045b6486ec3dee77a25db1f8c8615869c50c84273bdb55df0e937644b3c"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.667655 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" podStartSLOduration=120.667613385 podStartE2EDuration="2m0.667613385s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-29 03:29:52.653395848 +0000 UTC m=+146.137624753" watchObservedRunningTime="2026-01-29 03:29:52.667613385 +0000 UTC m=+146.151842310" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.707754 4707 patch_prober.go:28] interesting pod/router-default-5444994796-f7lpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 03:29:52 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 29 03:29:52 crc kubenswrapper[4707]: [+]process-running ok Jan 29 03:29:52 crc kubenswrapper[4707]: healthz check failed Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.708070 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f7lpb" podUID="c540dd44-ac44-44c6-8e9c-6e1d1890444e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.708986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" event={"ID":"605d630d-8166-4b8e-8594-80ed781b8b9d","Type":"ContainerStarted","Data":"fad824f7b5fee3d444e118945e0672bdf2542d79dd03202a7d2c8f691e9876e1"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.719584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" event={"ID":"2d8e31df-9566-48bf-9488-dfe398d5bb12","Type":"ContainerStarted","Data":"99fa0a3cb66f8012a0116ae3bf1474b5e7544229a7240d90de9c29f2fe365791"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.719647 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" podStartSLOduration=120.719621966 podStartE2EDuration="2m0.719621966s" 
podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.704221924 +0000 UTC m=+146.188450829" watchObservedRunningTime="2026-01-29 03:29:52.719621966 +0000 UTC m=+146.203850871" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.735383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.739657 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:53.239630547 +0000 UTC m=+146.723859452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.742014 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" podStartSLOduration=120.741965957 podStartE2EDuration="2m0.741965957s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.740269116 +0000 UTC m=+146.224498021" watchObservedRunningTime="2026-01-29 03:29:52.741965957 +0000 UTC m=+146.226194852" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.767895 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" podStartSLOduration=120.767871755 podStartE2EDuration="2m0.767871755s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.764926477 +0000 UTC m=+146.249155382" watchObservedRunningTime="2026-01-29 03:29:52.767871755 +0000 UTC m=+146.252100660" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.768722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.775831 4707 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dqhzd container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.775912 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" podUID="aa8df0a1-21ab-42b5-92fe-a444e09a0416" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.786158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6t9xh" event={"ID":"2798ce2d-9125-464c-8001-03c3e9f65af7","Type":"ContainerStarted","Data":"e940b4507f3e7801107fdd7377286693ad273aedd145b49f4aa8280811c4ff30"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.791867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" event={"ID":"32a45220-2ae6-4b62-80ae-8c82833390a4","Type":"ContainerStarted","Data":"3e6a3668c67029fdc28ed5fed79253db1afae608c113def3a788a1e6dd1c766b"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.792032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" event={"ID":"32a45220-2ae6-4b62-80ae-8c82833390a4","Type":"ContainerStarted","Data":"4ca2fd5c63ef3990ffd038ef179225e0ae8ed08db57010d9de92782317a3f3c5"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.794281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml" event={"ID":"aa6f96d8-64db-4d31-b07f-a933cca2d98f","Type":"ContainerStarted","Data":"451b738970eedf2448749ca2391a227e6a04f2d1c3e521008c05ec251ba7d343"} Jan 29 
03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.801266 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" podStartSLOduration=120.801245618 podStartE2EDuration="2m0.801245618s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.797923208 +0000 UTC m=+146.282152123" watchObservedRunningTime="2026-01-29 03:29:52.801245618 +0000 UTC m=+146.285474523" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.809212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" event={"ID":"94e68ae3-2da0-4fd9-a745-840760dd4efe","Type":"ContainerStarted","Data":"e54efd5537177093a55b5c1c95ddbc8b42f9cbadf9f73cf058fdef7857e33ffb"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.809280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" event={"ID":"94e68ae3-2da0-4fd9-a745-840760dd4efe","Type":"ContainerStarted","Data":"2c2cb2d2e6e00b8eb6502e259f5c08baccf9e5048f18155f8e53fd7a94e5f8a7"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.809842 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.811335 4707 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-57mk4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.811384 4707 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" podUID="94e68ae3-2da0-4fd9-a745-840760dd4efe" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.817547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm" event={"ID":"09878060-fe42-40c3-b8b2-4392225b3669","Type":"ContainerStarted","Data":"4a4d386ff8c01b3804102abe25f1745d85e482f92911a3bf19884a882068baba"} Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.826339 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hp957" podStartSLOduration=120.8263026 podStartE2EDuration="2m0.8263026s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.825590959 +0000 UTC m=+146.309819884" watchObservedRunningTime="2026-01-29 03:29:52.8263026 +0000 UTC m=+146.310531505" Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.836606 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.839771 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-29 03:29:53.339748104 +0000 UTC m=+146.823977009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.855716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" event={"ID":"daef8de5-6c32-460e-8830-884671338aca","Type":"ContainerStarted","Data":"b6fcdadcf5cdbaf208829fe39985dbdc774cd484c75f9fbf96aed4b0598d4338"}
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.873103 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jr7sf" podStartSLOduration=120.873089375 podStartE2EDuration="2m0.873089375s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.871695753 +0000 UTC m=+146.355924658" watchObservedRunningTime="2026-01-29 03:29:52.873089375 +0000 UTC m=+146.357318280"
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.873555 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-srpcb" podStartSLOduration=120.873551439 podStartE2EDuration="2m0.873551439s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.849159007 +0000 UTC m=+146.333387912" watchObservedRunningTime="2026-01-29 03:29:52.873551439 +0000 UTC m=+146.357780344"
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.873747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" event={"ID":"843bc0fd-4ef0-4f01-b1a1-b1281063a3dc","Type":"ContainerStarted","Data":"e5ba7f32f5ca7ae6574f9db06179cbb572155387a4cfe33d6d4beabbcdd917df"}
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.897379 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" podStartSLOduration=120.897362294 podStartE2EDuration="2m0.897362294s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.895027204 +0000 UTC m=+146.379256109" watchObservedRunningTime="2026-01-29 03:29:52.897362294 +0000 UTC m=+146.381591199"
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.920183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz" event={"ID":"52c7f103-7152-484c-aff9-da45a3f8ac20","Type":"ContainerStarted","Data":"cbe1af094e4b8aaa1d899c942337e14344699146ee41ac6eaa115a2113fe7175"}
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.936131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" event={"ID":"4afbbde5-74e6-4ce1-bcc1-8d517069e49b","Type":"ContainerStarted","Data":"5386c70fade5c45e0e8db8293e38f8ec5df66d0d6d9800eb854fea2f71dc2558"}
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.936729 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m"
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.940741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" event={"ID":"8a2523fe-5501-417e-9d8b-85e936ed840c","Type":"ContainerStarted","Data":"66cda181331724c61b7b6ac6d67aa4808c1e67404a247bd02e6519ebfb0d7c38"}
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.944320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" event={"ID":"90329817-eb42-4c0b-8e57-908c60c1db50","Type":"ContainerStarted","Data":"be0fe40dc9fcef1643f26a26bf900b9dbac84da5ec17a2986af0d3cc9dac16f0"}
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.944445 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:52 crc kubenswrapper[4707]: E0129 03:29:52.946395 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:53.446381086 +0000 UTC m=+146.930609991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.947887 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-snpzw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.947920 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-snpzw" podUID="c193fefb-9fdb-479f-851b-4f1fd4c9d087" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Jan 29 03:29:52 crc kubenswrapper[4707]: I0129 03:29:52.954856 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" podStartSLOduration=120.95484182 podStartE2EDuration="2m0.95484182s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.929717946 +0000 UTC m=+146.413946851" watchObservedRunningTime="2026-01-29 03:29:52.95484182 +0000 UTC m=+146.439070725"
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.009788 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" podStartSLOduration=121.00977163 podStartE2EDuration="2m1.00977163s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:52.980923634 +0000 UTC m=+146.465152539" watchObservedRunningTime="2026-01-29 03:29:53.00977163 +0000 UTC m=+146.494000535"
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.012420 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vt6tg" podStartSLOduration=121.012412139 podStartE2EDuration="2m1.012412139s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:53.00911664 +0000 UTC m=+146.493345545" watchObservedRunningTime="2026-01-29 03:29:53.012412139 +0000 UTC m=+146.496641044"
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.050581 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.052160 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:53.552142773 +0000 UTC m=+147.036371678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.068842 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" podStartSLOduration=121.068815203 podStartE2EDuration="2m1.068815203s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:53.067505264 +0000 UTC m=+146.551734169" watchObservedRunningTime="2026-01-29 03:29:53.068815203 +0000 UTC m=+146.553044128"
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.072376 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" podStartSLOduration=121.07235475 podStartE2EDuration="2m1.07235475s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:53.04373395 +0000 UTC m=+146.527962875" watchObservedRunningTime="2026-01-29 03:29:53.07235475 +0000 UTC m=+146.556583655"
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.152330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.153049 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:53.653036122 +0000 UTC m=+147.137265027 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.254290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.255480 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:53.755449068 +0000 UTC m=+147.239677983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.356337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.357093 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:53.85707673 +0000 UTC m=+147.341305635 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.458474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.459025 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:53.958990571 +0000 UTC m=+147.443219476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.459488 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.459973 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:53.959949629 +0000 UTC m=+147.444178534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.560689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.561386 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.061358304 +0000 UTC m=+147.545587219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.662849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.663289 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.163257014 +0000 UTC m=+147.647485919 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.705347 4707 patch_prober.go:28] interesting pod/router-default-5444994796-f7lpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 29 03:29:53 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld
Jan 29 03:29:53 crc kubenswrapper[4707]: [+]process-running ok
Jan 29 03:29:53 crc kubenswrapper[4707]: healthz check failed
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.705789 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f7lpb" podUID="c540dd44-ac44-44c6-8e9c-6e1d1890444e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.764236 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.764444 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.264409212 +0000 UTC m=+147.748638117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.764997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.765428 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.265414392 +0000 UTC m=+147.749643287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.866916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.867179 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.367134187 +0000 UTC m=+147.851363092 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.867439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.867881 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.367866439 +0000 UTC m=+147.852095344 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.958996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" event={"ID":"9194d298-b1b5-4b06-9254-b484dc1a1382","Type":"ContainerStarted","Data":"35f369920fccfbd42076ae282332aa0acfcfc5b674689300e1a416cf0deb5c7b"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.960928 4707 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-k6sdj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.39:6443/healthz\": dial tcp 10.217.0.39:6443: connect: connection refused" start-of-body=
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.961045 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.39:6443/healthz\": dial tcp 10.217.0.39:6443: connect: connection refused"
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.965442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" event={"ID":"8a2523fe-5501-417e-9d8b-85e936ed840c","Type":"ContainerStarted","Data":"d40a66ec96f5ca57e43ff814f15667bfd22c3643ec24b11e058c97106c72c348"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.968083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:53 crc kubenswrapper[4707]: E0129 03:29:53.969037 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.468992466 +0000 UTC m=+147.953221371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.976155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7p8hx" event={"ID":"48412975-a793-44ee-a22c-dc7ad4145451","Type":"ContainerStarted","Data":"48642042499f21f8c9882c90a18cf82ae712b7d1c6c31915036ed0e841607d4e"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.979872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm" event={"ID":"09878060-fe42-40c3-b8b2-4392225b3669","Type":"ContainerStarted","Data":"7dc7f1d5c34039b682e2256d9d9990f4248bd2c59e6885a76affec9debd06b0b"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.982806 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" event={"ID":"daef8de5-6c32-460e-8830-884671338aca","Type":"ContainerStarted","Data":"61207b62195af461a4d5550917b10dd72833930989b3af14c37ae0488dfaed64"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.984924 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" event={"ID":"23df2202-fce8-4515-b147-1256fe6d953b","Type":"ContainerStarted","Data":"a3a7ab0ae14765a86908c7e89252912e627183ec01704fead8f8fe12e39509de"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.985488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" event={"ID":"23df2202-fce8-4515-b147-1256fe6d953b","Type":"ContainerStarted","Data":"20d39de7f4fd0e05472abaf9d6d997e062d847d2025b7990fb380c4ff3bd0a30"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.986645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" event={"ID":"4afbbde5-74e6-4ce1-bcc1-8d517069e49b","Type":"ContainerStarted","Data":"1c20ef644e585fdb7fecdeb48bc604a8d00ac26e8848d520127e68efdc1e3a1f"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.987926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hp957" event={"ID":"53bfd3ca-7447-44cf-af4c-165db1f5e7be","Type":"ContainerStarted","Data":"22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.989444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" event={"ID":"74f4c17e-e02b-4971-9341-db1810cb5192","Type":"ContainerStarted","Data":"4c16c61bb276eb7cd883a03b9f4aab4c675cdee43c1d0820f38c7506427e6108"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.989473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" event={"ID":"74f4c17e-e02b-4971-9341-db1810cb5192","Type":"ContainerStarted","Data":"4dfd3b2ea93a747aee90c5197be12fc011b06a236bd0ca4e4aed86a5588caa29"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.991325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7h7rj" event={"ID":"843bc0fd-4ef0-4f01-b1a1-b1281063a3dc","Type":"ContainerStarted","Data":"deccaddf795639ac2c144efe7581d710711d21a42ad029d6ef6121d9624690bb"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.993398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml" event={"ID":"aa6f96d8-64db-4d31-b07f-a933cca2d98f","Type":"ContainerStarted","Data":"dc5df11ba61b18a3bb756e60d1b83f6c430d75dd8f9e2e869ffcb351e2e382ff"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.993424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml" event={"ID":"aa6f96d8-64db-4d31-b07f-a933cca2d98f","Type":"ContainerStarted","Data":"168aa055df11e2c5af7a0ba973e4dfad1affed078ef36daef97f0d16d889b064"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.994844 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sfkbk" event={"ID":"4d9bdeda-5d83-4e63-bce4-527dd6aea51e","Type":"ContainerStarted","Data":"422f6db3c0da6b978f94e96f0af33b02d9fd7beda5d5d91111fa215bad18bfa7"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.994867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sfkbk" event={"ID":"4d9bdeda-5d83-4e63-bce4-527dd6aea51e","Type":"ContainerStarted","Data":"cd8bfd9d89f5ba8373a2bc7cae38d1680c10c6cf2985f7f13b58409c8e892a68"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.996626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" event={"ID":"96781d63-5e82-4529-9051-c3b5c9a8175f","Type":"ContainerStarted","Data":"57b0aec65e14f3ace70c557f1e22ce5d58a744b809ae5d921ed68618706c43f1"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.997807 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7v2mk" event={"ID":"b20b69c3-e242-41ec-b726-c39c4338f7ba","Type":"ContainerStarted","Data":"1d4449ae0885154dbe3ddb15e7bccbafeeb668460e9566a0fa8a652ce39c70ea"}
Jan 29 03:29:53 crc kubenswrapper[4707]: I0129 03:29:53.999941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" event={"ID":"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007","Type":"ContainerStarted","Data":"7b55aa217af136d1609cc5e83344365ccfe25a8dd2b8e26dbd2946b3b15ae94d"}
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.000075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" event={"ID":"ce8e2e6d-c59e-4be6-a36c-d55ce4bdf007","Type":"ContainerStarted","Data":"746135489dfcd5525f2b9d366f379d11f3ec46b4ef66c59b3e3c033fe1941e67"}
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.002119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" event={"ID":"f7a06628-39ad-42db-8dec-259a64cc9947","Type":"ContainerStarted","Data":"cff69f3651a6f4ac9867b2afc5f903305e4f1d5fac42c7d6da64ddec4e7aee6e"}
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.004491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" event={"ID":"5251706a-66cb-47ad-a87f-047d3d252bd6","Type":"ContainerStarted","Data":"89cbd005de841fdd48b0624af137f29e4ccc95bfbb3b1718221f60fd9e994ddb"}
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.006364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" event={"ID":"605d630d-8166-4b8e-8594-80ed781b8b9d","Type":"ContainerStarted","Data":"697015dc32cf3db88ab719eeff47bdc931bf07da0f96b11c47c4a9965972d438"}
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.008143 4707 generic.go:334] "Generic (PLEG): container finished" podID="2911ab30-6b8e-4e2f-8bb2-2d7270d16c33" containerID="637e71ed438515e436260cdccc83c8a985c0b76a5c95871d259c2705f2bdbbf4" exitCode=0
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.008296 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" event={"ID":"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33","Type":"ContainerDied","Data":"637e71ed438515e436260cdccc83c8a985c0b76a5c95871d259c2705f2bdbbf4"}
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.012967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" event={"ID":"aa8df0a1-21ab-42b5-92fe-a444e09a0416","Type":"ContainerStarted","Data":"26706989b3df8f41d2583af76e3326df0f0affddbc1c6ba728846bd7fe74a4de"}
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.013713 4707 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dqhzd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body=
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.013743 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" podUID="aa8df0a1-21ab-42b5-92fe-a444e09a0416" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused"
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.019683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9tjzd" event={"ID":"6f82d8d3-27e8-476a-a7dd-997fbb5ce6fb","Type":"ContainerStarted","Data":"d45ea70d919b516a53afced5d4598c8c9dd89d7cf45b1c6a7a1681d4fe8680a0"}
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.020836 4707 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-b9hgf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body=
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.020870 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" podUID="f5aa9908-9832-4df9-b0b3-5c4e466fac2f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused"
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.021755 4707 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-57mk4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.021779 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" podUID="94e68ae3-2da0-4fd9-a745-840760dd4efe" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused"
Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.021900 4707 patch_prober.go:28] interesting
pod/console-operator-58897d9998-h9hdt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.021932 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h9hdt" podUID="cf7f47e6-91e0-4084-b586-69208aba0921" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.023736 4707 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tpsmp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.023783 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" podUID="eeef9237-bd0b-494d-a3a0-8b6e54baa03e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.073960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.074397 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.574385121 +0000 UTC m=+148.058614026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.083333 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwwcm" podStartSLOduration=122.083316549 podStartE2EDuration="2m2.083316549s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.031465042 +0000 UTC m=+147.515693947" watchObservedRunningTime="2026-01-29 03:29:54.083316549 +0000 UTC m=+147.567545444" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.084813 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8bkh7" podStartSLOduration=122.084807414 podStartE2EDuration="2m2.084807414s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.083744952 +0000 UTC m=+147.567973857" watchObservedRunningTime="2026-01-29 03:29:54.084807414 +0000 UTC m=+147.569036319" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.141306 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sfkbk" podStartSLOduration=9.14128923 podStartE2EDuration="9.14128923s" podCreationTimestamp="2026-01-29 03:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.134561978 +0000 UTC m=+147.618790883" watchObservedRunningTime="2026-01-29 03:29:54.14128923 +0000 UTC m=+147.625518135" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.175394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.179826 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.679807207 +0000 UTC m=+148.164036112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.189122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.203087 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.703071055 +0000 UTC m=+148.187299960 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.293416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.293867 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.793850162 +0000 UTC m=+148.278079067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.397224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.397872 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:54.897860295 +0000 UTC m=+148.382089200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.461146 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s6tr9" podStartSLOduration=122.461121595 podStartE2EDuration="2m2.461121595s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.292302055 +0000 UTC m=+147.776530960" watchObservedRunningTime="2026-01-29 03:29:54.461121595 +0000 UTC m=+147.945350490" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.461395 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" podStartSLOduration=122.461389623 podStartE2EDuration="2m2.461389623s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.43830009 +0000 UTC m=+147.922529015" watchObservedRunningTime="2026-01-29 03:29:54.461389623 +0000 UTC m=+147.945618528" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.500166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.500832 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.000818317 +0000 UTC m=+148.485047222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.563496 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hhlzr" podStartSLOduration=122.563481749 podStartE2EDuration="2m2.563481749s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.5202087 +0000 UTC m=+148.004437615" watchObservedRunningTime="2026-01-29 03:29:54.563481749 +0000 UTC m=+148.047710654" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.563827 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fzdml" podStartSLOduration=122.563820669 podStartE2EDuration="2m2.563820669s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.562755417 +0000 UTC 
m=+148.046984312" watchObservedRunningTime="2026-01-29 03:29:54.563820669 +0000 UTC m=+148.048049574" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.603691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.603978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.604082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.604163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.604253 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.605447 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.609693 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pgx7m" podStartSLOduration=122.609682627 podStartE2EDuration="2m2.609682627s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.606916454 +0000 UTC m=+148.091145359" watchObservedRunningTime="2026-01-29 03:29:54.609682627 +0000 UTC m=+148.093911532" Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.610480 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.11046897 +0000 UTC m=+148.594697875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.622984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.625408 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.636150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.662818 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.668769 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwgsz" podStartSLOduration=122.66875274 podStartE2EDuration="2m2.66875274s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.660666188 +0000 UTC m=+148.144895093" watchObservedRunningTime="2026-01-29 03:29:54.66875274 +0000 UTC m=+148.152981645" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.670858 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.701717 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtt54" podStartSLOduration=122.70170083 podStartE2EDuration="2m2.70170083s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.701068261 +0000 UTC m=+148.185297166" watchObservedRunningTime="2026-01-29 03:29:54.70170083 +0000 UTC m=+148.185929735" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.704987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.705397 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.205383561 +0000 UTC m=+148.689612466 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.717873 4707 patch_prober.go:28] interesting pod/router-default-5444994796-f7lpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 03:29:54 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 29 03:29:54 crc kubenswrapper[4707]: [+]process-running ok Jan 29 03:29:54 crc kubenswrapper[4707]: healthz check failed Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.718335 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f7lpb" podUID="c540dd44-ac44-44c6-8e9c-6e1d1890444e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.788968 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.791415 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khm86" podStartSLOduration=122.791400164 podStartE2EDuration="2m2.791400164s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.753961139 +0000 UTC m=+148.238190044" watchObservedRunningTime="2026-01-29 03:29:54.791400164 +0000 UTC m=+148.275629069" Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.807116 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.807455 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.307442086 +0000 UTC m=+148.791670991 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.911346 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:54 crc kubenswrapper[4707]: E0129 03:29:54.912029 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.412013396 +0000 UTC m=+148.896242301 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.951343 4707 csr.go:261] certificate signing request csr-hlv4h is approved, waiting to be issued Jan 29 03:29:54 crc kubenswrapper[4707]: I0129 03:29:54.976805 4707 csr.go:257] certificate signing request csr-hlv4h is issued Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.016478 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.016881 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.516868595 +0000 UTC m=+149.001097500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.117298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.117950 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.61791952 +0000 UTC m=+149.102148425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.127599 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-sfkbk" Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.142325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" event={"ID":"2911ab30-6b8e-4e2f-8bb2-2d7270d16c33","Type":"ContainerStarted","Data":"47453b5b12703e6867f07dd00224bbdd04bd88ce6bd7df3522ea1a8eb37a0c93"} Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.192794 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsm6m" podStartSLOduration=123.192776988 podStartE2EDuration="2m3.192776988s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:54.794137466 +0000 UTC m=+148.278366371" watchObservedRunningTime="2026-01-29 03:29:55.192776988 +0000 UTC m=+148.677005893" Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.192927 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69" podStartSLOduration=123.192923282 podStartE2EDuration="2m3.192923282s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-01-29 03:29:55.191078357 +0000 UTC m=+148.675307262" watchObservedRunningTime="2026-01-29 03:29:55.192923282 +0000 UTC m=+148.677152187" Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.225515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.229930 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.729900553 +0000 UTC m=+149.214129458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.282268 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.326327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.326670 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.826643028 +0000 UTC m=+149.310871933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.326969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.327328 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.827321178 +0000 UTC m=+149.311550083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.431096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.431457 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:55.931442515 +0000 UTC m=+149.415671420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.534503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.535046 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.035025366 +0000 UTC m=+149.519254271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: W0129 03:29:55.586197 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-a1ef742026ea1c52127fd7b3c0cdcd7e75ad43b2492b7ac0e4ed052a9084f234 WatchSource:0}: Error finding container a1ef742026ea1c52127fd7b3c0cdcd7e75ad43b2492b7ac0e4ed052a9084f234: Status 404 returned error can't find the container with id a1ef742026ea1c52127fd7b3c0cdcd7e75ad43b2492b7ac0e4ed052a9084f234 Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.644285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.644769 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.144746551 +0000 UTC m=+149.628975456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.715466 4707 patch_prober.go:28] interesting pod/router-default-5444994796-f7lpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 03:29:55 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 29 03:29:55 crc kubenswrapper[4707]: [+]process-running ok Jan 29 03:29:55 crc kubenswrapper[4707]: healthz check failed Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.715561 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f7lpb" podUID="c540dd44-ac44-44c6-8e9c-6e1d1890444e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.748527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.749185 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-29 03:29:56.249143126 +0000 UTC m=+149.733372031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.850340 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.850422 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.350405827 +0000 UTC m=+149.834634732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.850990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.851395 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.351375246 +0000 UTC m=+149.835604341 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.952074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:55 crc kubenswrapper[4707]: E0129 03:29:55.952563 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.452505513 +0000 UTC m=+149.936734418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.967672 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-h9hdt" Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.979620 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-29 03:24:54 +0000 UTC, rotation deadline is 2026-10-14 12:46:04.329583159 +0000 UTC Jan 29 03:29:55 crc kubenswrapper[4707]: I0129 03:29:55.979655 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6201h16m8.34993025s for next certificate rotation Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.053492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.053843 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.553830786 +0000 UTC m=+150.038059691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.145982 4707 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dqhzd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.146581 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" podUID="aa8df0a1-21ab-42b5-92fe-a444e09a0416" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.147307 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nhjg7"] Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.148455 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.154196 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.154392 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.654356425 +0000 UTC m=+150.138585330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.154473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.154819 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.654805069 +0000 UTC m=+150.139033974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.164899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ef63f5711ccacc3959d95001108214d55f4b1c8c66601ec814136e89ddb0507b"} Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.164959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"da987e6f27a83a7a4e2866c21f654f41fe2c4531c028a338e44f11e8dfe26158"} Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.188568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" event={"ID":"96781d63-5e82-4529-9051-c3b5c9a8175f","Type":"ContainerStarted","Data":"6286a89afb3b7997040c63fa1a32461e5bf3da4ff0a6d52019a6dd8fdcf0026b"} Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.188648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" event={"ID":"96781d63-5e82-4529-9051-c3b5c9a8175f","Type":"ContainerStarted","Data":"8560aeef7d0ecec0823ebee5acba8d6eb3caf690efaca43a2d4db48ba2d88037"} Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.189007 4707 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.206069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"73cdf34063b9707e83123b265d10ac550a47b07a5a3f4f19a080a5fcae771731"} Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.206136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"38b0fb4fb559584b5d37d290c0122aebe7f0534d2f33c85f3a77823dab42f1fa"} Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.207018 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.213584 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nhjg7"] Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.224137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a1ef742026ea1c52127fd7b3c0cdcd7e75ad43b2492b7ac0e4ed052a9084f234"} Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.256949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.257200 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-catalog-content\") pod \"certified-operators-nhjg7\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.257287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-utilities\") pod \"certified-operators-nhjg7\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.257341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llwfv\" (UniqueName: \"kubernetes.io/projected/a832dac2-976f-45e7-adc9-fc29666d0721-kube-api-access-llwfv\") pod \"certified-operators-nhjg7\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.257584 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.757561675 +0000 UTC m=+150.241790580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.262822 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.314751 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xrvfw"] Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.315737 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrvfw" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.324779 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.332965 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.358520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-utilities\") pod \"community-operators-xrvfw\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " pod="openshift-marketplace/community-operators-xrvfw" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.358771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-catalog-content\") pod \"community-operators-xrvfw\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " pod="openshift-marketplace/community-operators-xrvfw" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.358815 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-catalog-content\") pod \"certified-operators-nhjg7\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.358915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.358947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-utilities\") pod \"certified-operators-nhjg7\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.359008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-568r6\" (UniqueName: \"kubernetes.io/projected/b4d4ff70-611c-4a65-982c-f551baa66bd5-kube-api-access-568r6\") pod \"community-operators-xrvfw\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " pod="openshift-marketplace/community-operators-xrvfw" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.359029 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-llwfv\" (UniqueName: \"kubernetes.io/projected/a832dac2-976f-45e7-adc9-fc29666d0721-kube-api-access-llwfv\") pod \"certified-operators-nhjg7\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.360926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-utilities\") pod \"certified-operators-nhjg7\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.360951 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.860928759 +0000 UTC m=+150.345157664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.361136 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-catalog-content\") pod \"certified-operators-nhjg7\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.383372 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrvfw"] Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.421644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llwfv\" (UniqueName: \"kubernetes.io/projected/a832dac2-976f-45e7-adc9-fc29666d0721-kube-api-access-llwfv\") pod \"certified-operators-nhjg7\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.465428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.465667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-utilities\") pod \"community-operators-xrvfw\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.465729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-catalog-content\") pod \"community-operators-xrvfw\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.465793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-568r6\" (UniqueName: \"kubernetes.io/projected/b4d4ff70-611c-4a65-982c-f551baa66bd5-kube-api-access-568r6\") pod \"community-operators-xrvfw\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.466294 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:56.966270632 +0000 UTC m=+150.450499537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.466735 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-utilities\") pod \"community-operators-xrvfw\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.466956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-catalog-content\") pod \"community-operators-xrvfw\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.493078 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nhjg7"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.569645 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ss5sc"]
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.572297 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.577228 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.577714 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:57.077699859 +0000 UTC m=+150.561928764 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.596501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-568r6\" (UniqueName: \"kubernetes.io/projected/b4d4ff70-611c-4a65-982c-f551baa66bd5-kube-api-access-568r6\") pod \"community-operators-xrvfw\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.658394 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ss5sc"]
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.658896 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.679074 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.679286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc7k5\" (UniqueName: \"kubernetes.io/projected/c24271ec-27aa-4e94-8244-9b05496687ee-kube-api-access-hc7k5\") pod \"certified-operators-ss5sc\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.679356 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-utilities\") pod \"certified-operators-ss5sc\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.679396 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-catalog-content\") pod \"certified-operators-ss5sc\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.679493 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:57.179476415 +0000 UTC m=+150.663705320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.713830 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bn9xb"]
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.715361 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.738253 4707 patch_prober.go:28] interesting pod/router-default-5444994796-f7lpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 29 03:29:56 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld
Jan 29 03:29:56 crc kubenswrapper[4707]: [+]process-running ok
Jan 29 03:29:56 crc kubenswrapper[4707]: healthz check failed
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.738316 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f7lpb" podUID="c540dd44-ac44-44c6-8e9c-6e1d1890444e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.761472 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bn9xb"]
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.783856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-catalog-content\") pod \"community-operators-bn9xb\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") " pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.783908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-utilities\") pod \"certified-operators-ss5sc\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.783929 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-utilities\") pod \"community-operators-bn9xb\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") " pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.783964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.783997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-catalog-content\") pod \"certified-operators-ss5sc\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.784028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc7k5\" (UniqueName: \"kubernetes.io/projected/c24271ec-27aa-4e94-8244-9b05496687ee-kube-api-access-hc7k5\") pod \"certified-operators-ss5sc\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.784076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4hxl\" (UniqueName: \"kubernetes.io/projected/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-kube-api-access-m4hxl\") pod \"community-operators-bn9xb\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") " pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.784845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-utilities\") pod \"certified-operators-ss5sc\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.785217 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:57.2851965 +0000 UTC m=+150.769425405 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.785469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-catalog-content\") pod \"certified-operators-ss5sc\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.815212 4707 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.849243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc7k5\" (UniqueName: \"kubernetes.io/projected/c24271ec-27aa-4e94-8244-9b05496687ee-kube-api-access-hc7k5\") pod \"certified-operators-ss5sc\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.896993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.897287 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-catalog-content\") pod \"community-operators-bn9xb\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") " pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.897315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-utilities\") pod \"community-operators-bn9xb\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") " pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.897389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4hxl\" (UniqueName: \"kubernetes.io/projected/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-kube-api-access-m4hxl\") pod \"community-operators-bn9xb\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") " pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: E0129 03:29:56.897465 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:57.397430931 +0000 UTC m=+150.881659836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.898170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-catalog-content\") pod \"community-operators-bn9xb\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") " pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.898209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-utilities\") pod \"community-operators-bn9xb\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") " pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.924045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4hxl\" (UniqueName: \"kubernetes.io/projected/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-kube-api-access-m4hxl\") pod \"community-operators-bn9xb\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") " pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.951735 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.971008 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz"
Jan 29 03:29:56 crc kubenswrapper[4707]: I0129 03:29:56.985591 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-kkzfz"
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.001658 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:57 crc kubenswrapper[4707]: E0129 03:29:57.002066 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:57.502050743 +0000 UTC m=+150.986279648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.016941 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nhjg7"]
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.104067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:57 crc kubenswrapper[4707]: E0129 03:29:57.105796 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:57.605773357 +0000 UTC m=+151.090002262 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.120643 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrvfw"]
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.129918 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:29:57 crc kubenswrapper[4707]: W0129 03:29:57.136865 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4d4ff70_611c_4a65_982c_f551baa66bd5.slice/crio-8040e8d500b2719590accb1880bb964170067879cf740a47dab9641e2a62f7f8 WatchSource:0}: Error finding container 8040e8d500b2719590accb1880bb964170067879cf740a47dab9641e2a62f7f8: Status 404 returned error can't find the container with id 8040e8d500b2719590accb1880bb964170067879cf740a47dab9641e2a62f7f8
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.206178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:57 crc kubenswrapper[4707]: E0129 03:29:57.206975 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:57.706921714 +0000 UTC m=+151.191150619 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.308722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:57 crc kubenswrapper[4707]: E0129 03:29:57.309147 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 03:29:57.809124883 +0000 UTC m=+151.293353788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.309639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" event={"ID":"96781d63-5e82-4529-9051-c3b5c9a8175f","Type":"ContainerStarted","Data":"d3772bf90d3213c360f9c20ac5d0cf13bf91da8dedfe26ad68592fe01ab2ae12"}
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.321452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjg7" event={"ID":"a832dac2-976f-45e7-adc9-fc29666d0721","Type":"ContainerStarted","Data":"9bb06d26a551ddb3d1ff3c417b44f38f47b2b8451e9cc22b56cc6733b9a1bd6d"}
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.323290 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.324016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrvfw" event={"ID":"b4d4ff70-611c-4a65-982c-f551baa66bd5","Type":"ContainerStarted","Data":"8040e8d500b2719590accb1880bb964170067879cf740a47dab9641e2a62f7f8"}
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.334979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e8bfa639609ddc960d516e04c3296f054852226ea3d75c0e4e06b63aab13f4b7"}
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.357838 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7vfwn" podStartSLOduration=11.357813576 podStartE2EDuration="11.357813576s" podCreationTimestamp="2026-01-29 03:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:57.355749654 +0000 UTC m=+150.839978559" watchObservedRunningTime="2026-01-29 03:29:57.357813576 +0000 UTC m=+150.842042471"
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.402954 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ss5sc"]
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.425989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:57 crc kubenswrapper[4707]: E0129 03:29:57.430082 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 03:29:57.930061065 +0000 UTC m=+151.414289970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z575b" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.443850 4707 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-29T03:29:56.815242552Z","Handler":null,"Name":""}
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.462763 4707 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.462821 4707 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.529313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.537040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.550673 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bn9xb"]
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.606554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69"
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.631498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.640706 4707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.640766 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.648884 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cbr69"
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.681179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z575b\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") " pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.705192 4707 patch_prober.go:28] interesting pod/router-default-5444994796-f7lpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 29 03:29:57 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld
Jan 29 03:29:57 crc kubenswrapper[4707]: [+]process-running ok
Jan 29 03:29:57 crc kubenswrapper[4707]: healthz check failed
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.705286 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f7lpb" podUID="c540dd44-ac44-44c6-8e9c-6e1d1890444e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 29 03:29:57 crc kubenswrapper[4707]: I0129 03:29:57.920314 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.165485 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.166796 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.169347 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.169837 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.182943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.241882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.242076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.245234 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z575b"]
Jan 29 03:29:58 crc kubenswrapper[4707]: W0129 03:29:58.271447 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe4f03a9_cb43_4405_902b_eb2cdb645eb8.slice/crio-f1cd557f639d6178c7eb2d82cc29114b1330785da91b11e2283763b91307715a WatchSource:0}: Error finding container f1cd557f639d6178c7eb2d82cc29114b1330785da91b11e2283763b91307715a: Status 404 returned error can't find the container with id f1cd557f639d6178c7eb2d82cc29114b1330785da91b11e2283763b91307715a
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.275903 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hb89x"]
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.277309 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hb89x"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.282609 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.283905 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.284592 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.297866 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hb89x"]
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.304481 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.323833 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-snpzw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.323930 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-snpzw" podUID="c193fefb-9fdb-479f-851b-4f1fd4c9d087" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.323988 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-snpzw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.324068 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-snpzw" podUID="c193fefb-9fdb-479f-851b-4f1fd4c9d087" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.342934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmxt7\" (UniqueName: \"kubernetes.io/projected/ad1e990a-db38-4eb8-8ae9-6bd700728e48-kube-api-access-fmxt7\") pod \"redhat-marketplace-hb89x\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " pod="openshift-marketplace/redhat-marketplace-hb89x"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.342994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.343066 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.343102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-utilities\") pod \"redhat-marketplace-hb89x\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") "
pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.343135 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-catalog-content\") pod \"redhat-marketplace-hb89x\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.344683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.384749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.392173 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerID="ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5" exitCode=0 Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.392494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrvfw" event={"ID":"b4d4ff70-611c-4a65-982c-f551baa66bd5","Type":"ContainerDied","Data":"ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5"} Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.406322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-z575b" event={"ID":"fe4f03a9-cb43-4405-902b-eb2cdb645eb8","Type":"ContainerStarted","Data":"f1cd557f639d6178c7eb2d82cc29114b1330785da91b11e2283763b91307715a"} Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.413695 4707 generic.go:334] "Generic (PLEG): container finished" podID="8a2523fe-5501-417e-9d8b-85e936ed840c" containerID="d40a66ec96f5ca57e43ff814f15667bfd22c3643ec24b11e058c97106c72c348" exitCode=0 Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.413857 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" event={"ID":"8a2523fe-5501-417e-9d8b-85e936ed840c","Type":"ContainerDied","Data":"d40a66ec96f5ca57e43ff814f15667bfd22c3643ec24b11e058c97106c72c348"} Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.431613 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerID="a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af" exitCode=0 Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.431705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bn9xb" event={"ID":"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e","Type":"ContainerDied","Data":"a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af"} Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.431739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bn9xb" event={"ID":"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e","Type":"ContainerStarted","Data":"3e147951199e2f5e859feb739713c30e574ec20201de21da875f1735f63a9379"} Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.453296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmxt7\" (UniqueName: \"kubernetes.io/projected/ad1e990a-db38-4eb8-8ae9-6bd700728e48-kube-api-access-fmxt7\") pod 
\"redhat-marketplace-hb89x\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.453395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-utilities\") pod \"redhat-marketplace-hb89x\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.453425 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-catalog-content\") pod \"redhat-marketplace-hb89x\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.453899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-catalog-content\") pod \"redhat-marketplace-hb89x\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.454123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-utilities\") pod \"redhat-marketplace-hb89x\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.460418 4707 generic.go:334] "Generic (PLEG): container finished" podID="a832dac2-976f-45e7-adc9-fc29666d0721" containerID="aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf" exitCode=0 Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.460691 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjg7" event={"ID":"a832dac2-976f-45e7-adc9-fc29666d0721","Type":"ContainerDied","Data":"aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf"} Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.483092 4707 generic.go:334] "Generic (PLEG): container finished" podID="c24271ec-27aa-4e94-8244-9b05496687ee" containerID="feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6" exitCode=0 Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.484192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ss5sc" event={"ID":"c24271ec-27aa-4e94-8244-9b05496687ee","Type":"ContainerDied","Data":"feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6"} Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.484284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ss5sc" event={"ID":"c24271ec-27aa-4e94-8244-9b05496687ee","Type":"ContainerStarted","Data":"755e939f4c507d4999a21c7ac8cc0bf2ceb3c710d02e8e8a2ae22c3d414b60a5"} Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.489757 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.496201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmxt7\" (UniqueName: \"kubernetes.io/projected/ad1e990a-db38-4eb8-8ae9-6bd700728e48-kube-api-access-fmxt7\") pod \"redhat-marketplace-hb89x\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.516855 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j2nxf" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.606199 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.681693 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ljzgq"] Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.683185 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.701412 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.703018 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ljzgq"] Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.709195 4707 patch_prober.go:28] interesting pod/router-default-5444994796-f7lpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 03:29:58 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Jan 29 03:29:58 crc kubenswrapper[4707]: [+]process-running ok Jan 29 03:29:58 crc kubenswrapper[4707]: healthz check failed Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.709297 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f7lpb" podUID="c540dd44-ac44-44c6-8e9c-6e1d1890444e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.757118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.757167 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.759754 4707 patch_prober.go:28] interesting pod/console-f9d7485db-hp957 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Jan 29 03:29:58 crc kubenswrapper[4707]: 
I0129 03:29:58.759846 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hp957" podUID="53bfd3ca-7447-44cf-af4c-165db1f5e7be" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.762673 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vv8w\" (UniqueName: \"kubernetes.io/projected/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-kube-api-access-9vv8w\") pod \"redhat-marketplace-ljzgq\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.762774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-catalog-content\") pod \"redhat-marketplace-ljzgq\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.762797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-utilities\") pod \"redhat-marketplace-ljzgq\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.792778 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dqhzd" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.811813 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:29:58 crc 
kubenswrapper[4707]: I0129 03:29:58.867721 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-utilities\") pod \"redhat-marketplace-ljzgq\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.868111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vv8w\" (UniqueName: \"kubernetes.io/projected/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-kube-api-access-9vv8w\") pod \"redhat-marketplace-ljzgq\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.868253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-catalog-content\") pod \"redhat-marketplace-ljzgq\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.868774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-catalog-content\") pod \"redhat-marketplace-ljzgq\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.874035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-utilities\") pod \"redhat-marketplace-ljzgq\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:58 crc kubenswrapper[4707]: I0129 03:29:58.928775 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vv8w\" (UniqueName: \"kubernetes.io/projected/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-kube-api-access-9vv8w\") pod \"redhat-marketplace-ljzgq\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.066252 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-b9hgf" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.070998 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.079320 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-57mk4" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.291100 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.292290 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wxdsw"] Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.293487 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.298218 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.316484 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxdsw"] Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.384258 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-utilities\") pod \"redhat-operators-wxdsw\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.384301 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v2zk\" (UniqueName: \"kubernetes.io/projected/c814051a-bbf1-4219-8089-8124cb1d3b7b-kube-api-access-4v2zk\") pod \"redhat-operators-wxdsw\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.384584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-catalog-content\") pod \"redhat-operators-wxdsw\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.425856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hb89x"] Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.448822 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 29 03:29:59 crc kubenswrapper[4707]: W0129 03:29:59.452250 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad1e990a_db38_4eb8_8ae9_6bd700728e48.slice/crio-2c854bc9ddb6166e890397dfe4cc7a80d5d5488b307effcebac9de3d33c25899 WatchSource:0}: Error finding container 2c854bc9ddb6166e890397dfe4cc7a80d5d5488b307effcebac9de3d33c25899: Status 404 returned error can't find the container with id 2c854bc9ddb6166e890397dfe4cc7a80d5d5488b307effcebac9de3d33c25899 Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.488104 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-utilities\") pod \"redhat-operators-wxdsw\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.488162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v2zk\" (UniqueName: \"kubernetes.io/projected/c814051a-bbf1-4219-8089-8124cb1d3b7b-kube-api-access-4v2zk\") pod \"redhat-operators-wxdsw\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.488237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-catalog-content\") pod \"redhat-operators-wxdsw\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.488786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-catalog-content\") pod \"redhat-operators-wxdsw\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.489036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-utilities\") pod \"redhat-operators-wxdsw\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.521978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v2zk\" (UniqueName: \"kubernetes.io/projected/c814051a-bbf1-4219-8089-8124cb1d3b7b-kube-api-access-4v2zk\") pod \"redhat-operators-wxdsw\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.527414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-z575b" event={"ID":"fe4f03a9-cb43-4405-902b-eb2cdb645eb8","Type":"ContainerStarted","Data":"0be403bbfa67fbb6ba49df6a32247461d85f19802e4c33c7e4807f4cf38656a8"} Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.528527 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.547006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hb89x" event={"ID":"ad1e990a-db38-4eb8-8ae9-6bd700728e48","Type":"ContainerStarted","Data":"2c854bc9ddb6166e890397dfe4cc7a80d5d5488b307effcebac9de3d33c25899"} Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.555348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff","Type":"ContainerStarted","Data":"02afc1edce2824b28d720607b08b7afa07b403caa5cbfb17989e1376e3c6df84"} Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.649073 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.672020 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-z575b" podStartSLOduration=127.672003394 podStartE2EDuration="2m7.672003394s" podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:29:59.557845266 +0000 UTC m=+153.042074171" watchObservedRunningTime="2026-01-29 03:29:59.672003394 +0000 UTC m=+153.156232299" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.678006 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gcm8f"] Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.679605 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.711391 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gcm8f"] Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.721849 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.745123 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-f7lpb" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.809142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-utilities\") pod \"redhat-operators-gcm8f\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.809225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-catalog-content\") pod \"redhat-operators-gcm8f\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.809252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ppvl\" (UniqueName: \"kubernetes.io/projected/97319f63-bd16-4f50-b2d3-65340867a92a-kube-api-access-4ppvl\") pod \"redhat-operators-gcm8f\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.866321 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ljzgq"] Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.911113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-catalog-content\") pod \"redhat-operators-gcm8f\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.911166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ppvl\" (UniqueName: \"kubernetes.io/projected/97319f63-bd16-4f50-b2d3-65340867a92a-kube-api-access-4ppvl\") pod \"redhat-operators-gcm8f\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.911308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-utilities\") pod \"redhat-operators-gcm8f\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.911997 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-utilities\") pod \"redhat-operators-gcm8f\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:29:59 crc kubenswrapper[4707]: I0129 03:29:59.912301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-catalog-content\") pod \"redhat-operators-gcm8f\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:29:59 
crc kubenswrapper[4707]: I0129 03:29:59.950311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ppvl\" (UniqueName: \"kubernetes.io/projected/97319f63-bd16-4f50-b2d3-65340867a92a-kube-api-access-4ppvl\") pod \"redhat-operators-gcm8f\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.110223 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.166515 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh"] Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.178014 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp"] Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.179437 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.181293 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp"] Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.201980 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.217982 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-secret-volume\") pod \"collect-profiles-29494290-z58jp\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.218093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-config-volume\") pod \"collect-profiles-29494290-z58jp\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.218141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs4bc\" (UniqueName: \"kubernetes.io/projected/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-kube-api-access-zs4bc\") pod \"collect-profiles-29494290-z58jp\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.319989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2523fe-5501-417e-9d8b-85e936ed840c-secret-volume\") pod \"8a2523fe-5501-417e-9d8b-85e936ed840c\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.320152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9ndf\" (UniqueName: 
\"kubernetes.io/projected/8a2523fe-5501-417e-9d8b-85e936ed840c-kube-api-access-m9ndf\") pod \"8a2523fe-5501-417e-9d8b-85e936ed840c\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.320228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2523fe-5501-417e-9d8b-85e936ed840c-config-volume\") pod \"8a2523fe-5501-417e-9d8b-85e936ed840c\" (UID: \"8a2523fe-5501-417e-9d8b-85e936ed840c\") " Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.324070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a2523fe-5501-417e-9d8b-85e936ed840c-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a2523fe-5501-417e-9d8b-85e936ed840c" (UID: "8a2523fe-5501-417e-9d8b-85e936ed840c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.324812 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs4bc\" (UniqueName: \"kubernetes.io/projected/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-kube-api-access-zs4bc\") pod \"collect-profiles-29494290-z58jp\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.324932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-secret-volume\") pod \"collect-profiles-29494290-z58jp\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.325147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-config-volume\") pod \"collect-profiles-29494290-z58jp\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.325216 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2523fe-5501-417e-9d8b-85e936ed840c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.326260 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-config-volume\") pod \"collect-profiles-29494290-z58jp\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.343594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2523fe-5501-417e-9d8b-85e936ed840c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a2523fe-5501-417e-9d8b-85e936ed840c" (UID: "8a2523fe-5501-417e-9d8b-85e936ed840c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.343761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2523fe-5501-417e-9d8b-85e936ed840c-kube-api-access-m9ndf" (OuterVolumeSpecName: "kube-api-access-m9ndf") pod "8a2523fe-5501-417e-9d8b-85e936ed840c" (UID: "8a2523fe-5501-417e-9d8b-85e936ed840c"). InnerVolumeSpecName "kube-api-access-m9ndf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.347412 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4bc\" (UniqueName: \"kubernetes.io/projected/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-kube-api-access-zs4bc\") pod \"collect-profiles-29494290-z58jp\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.358687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-secret-volume\") pod \"collect-profiles-29494290-z58jp\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.426776 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2523fe-5501-417e-9d8b-85e936ed840c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.426809 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9ndf\" (UniqueName: \"kubernetes.io/projected/8a2523fe-5501-417e-9d8b-85e936ed840c-kube-api-access-m9ndf\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.443327 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxdsw"] Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.574271 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.611969 4707 generic.go:334] "Generic (PLEG): container finished" podID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerID="9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1" exitCode=0 Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.612094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ljzgq" event={"ID":"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d","Type":"ContainerDied","Data":"9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1"} Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.612153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ljzgq" event={"ID":"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d","Type":"ContainerStarted","Data":"6e663d2a1ba47e2d37e06136a3bb0bee05a068c31de74a5f7813cd78cac3a0df"} Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.627504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff","Type":"ContainerStarted","Data":"ff28d816165e561dc290ff0e54c58f077ab8f7bb3dbe15c852fb7f01b89abfbf"} Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.634709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" event={"ID":"8a2523fe-5501-417e-9d8b-85e936ed840c","Type":"ContainerDied","Data":"66cda181331724c61b7b6ac6d67aa4808c1e67404a247bd02e6519ebfb0d7c38"} Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.634755 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66cda181331724c61b7b6ac6d67aa4808c1e67404a247bd02e6519ebfb0d7c38" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.634861 4707 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.646569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxdsw" event={"ID":"c814051a-bbf1-4219-8089-8124cb1d3b7b","Type":"ContainerStarted","Data":"1252384938d33057344238dc2bb7a6b8f818e385184fa2732bd805ed778fe8ee"} Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.667953 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.667926243 podStartE2EDuration="2.667926243s" podCreationTimestamp="2026-01-29 03:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:30:00.664095398 +0000 UTC m=+154.148324303" watchObservedRunningTime="2026-01-29 03:30:00.667926243 +0000 UTC m=+154.152155148" Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.680063 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerID="104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c" exitCode=0 Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.681091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hb89x" event={"ID":"ad1e990a-db38-4eb8-8ae9-6bd700728e48","Type":"ContainerDied","Data":"104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c"} Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.692646 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh"] Jan 29 03:30:00 crc kubenswrapper[4707]: I0129 03:30:00.704762 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494275-xzvdh"] Jan 29 03:30:00 
crc kubenswrapper[4707]: I0129 03:30:00.724357 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gcm8f"] Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.239339 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 29 03:30:01 crc kubenswrapper[4707]: E0129 03:30:01.240093 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2523fe-5501-417e-9d8b-85e936ed840c" containerName="collect-profiles" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.240106 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2523fe-5501-417e-9d8b-85e936ed840c" containerName="collect-profiles" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.240202 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2523fe-5501-417e-9d8b-85e936ed840c" containerName="collect-profiles" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.240639 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.243503 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.244892 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.276730 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2523fe-5501-417e-9d8b-85e936ed840c" path="/var/lib/kubelet/pods/8a2523fe-5501-417e-9d8b-85e936ed840c/volumes" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.277746 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.288687 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp"] Jan 29 03:30:01 crc kubenswrapper[4707]: W0129 03:30:01.317941 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e997ce0_2a4c_4d1c_9191_4de1ab444f09.slice/crio-e92ab4933b51f08b1e42b9f6567ab07d308c6cba14b6d80f92a8c671b30ad1af WatchSource:0}: Error finding container e92ab4933b51f08b1e42b9f6567ab07d308c6cba14b6d80f92a8c671b30ad1af: Status 404 returned error can't find the container with id e92ab4933b51f08b1e42b9f6567ab07d308c6cba14b6d80f92a8c671b30ad1af Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.362452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.362550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.464306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.464354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.464679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.489183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" 
Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.597347 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.723858 4707 generic.go:334] "Generic (PLEG): container finished" podID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerID="8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49" exitCode=0 Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.724006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxdsw" event={"ID":"c814051a-bbf1-4219-8089-8124cb1d3b7b","Type":"ContainerDied","Data":"8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49"} Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.734439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" event={"ID":"8e997ce0-2a4c-4d1c-9191-4de1ab444f09","Type":"ContainerStarted","Data":"e92ab4933b51f08b1e42b9f6567ab07d308c6cba14b6d80f92a8c671b30ad1af"} Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.738131 4707 generic.go:334] "Generic (PLEG): container finished" podID="97319f63-bd16-4f50-b2d3-65340867a92a" containerID="d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc" exitCode=0 Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.738200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcm8f" event={"ID":"97319f63-bd16-4f50-b2d3-65340867a92a","Type":"ContainerDied","Data":"d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc"} Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.738218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcm8f" 
event={"ID":"97319f63-bd16-4f50-b2d3-65340867a92a","Type":"ContainerStarted","Data":"5a603d8fd94f1f43e807d7fe9ab0ea6172f6d18401e2072a90c8e74d633c179f"} Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.740410 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b23b73c-579d-4e4b-bf91-0f60fc5f7aff" containerID="ff28d816165e561dc290ff0e54c58f077ab8f7bb3dbe15c852fb7f01b89abfbf" exitCode=0 Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.741668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff","Type":"ContainerDied","Data":"ff28d816165e561dc290ff0e54c58f077ab8f7bb3dbe15c852fb7f01b89abfbf"} Jan 29 03:30:01 crc kubenswrapper[4707]: I0129 03:30:01.966775 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 29 03:30:02 crc kubenswrapper[4707]: I0129 03:30:02.768051 4707 generic.go:334] "Generic (PLEG): container finished" podID="8e997ce0-2a4c-4d1c-9191-4de1ab444f09" containerID="6f377662679e8bc9b2f0eab54454145e3141084ebda1460ccef9a3576e7b597f" exitCode=0 Jan 29 03:30:02 crc kubenswrapper[4707]: I0129 03:30:02.768368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" event={"ID":"8e997ce0-2a4c-4d1c-9191-4de1ab444f09","Type":"ContainerDied","Data":"6f377662679e8bc9b2f0eab54454145e3141084ebda1460ccef9a3576e7b597f"} Jan 29 03:30:02 crc kubenswrapper[4707]: I0129 03:30:02.772068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ad1892a6-e93d-4323-81b3-4e46ad5f0310","Type":"ContainerStarted","Data":"b6102d6c2eb53dfaa622bc8b867bf8bea8957a389b1f4133e91b5827fa611c8c"} Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.156597 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.308791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kube-api-access\") pod \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\" (UID: \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\") " Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.308852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kubelet-dir\") pod \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\" (UID: \"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff\") " Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.309053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4b23b73c-579d-4e4b-bf91-0f60fc5f7aff" (UID: "4b23b73c-579d-4e4b-bf91-0f60fc5f7aff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.309911 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.320957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4b23b73c-579d-4e4b-bf91-0f60fc5f7aff" (UID: "4b23b73c-579d-4e4b-bf91-0f60fc5f7aff"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.411835 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b23b73c-579d-4e4b-bf91-0f60fc5f7aff-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.462739 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.462798 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.797906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4b23b73c-579d-4e4b-bf91-0f60fc5f7aff","Type":"ContainerDied","Data":"02afc1edce2824b28d720607b08b7afa07b403caa5cbfb17989e1376e3c6df84"} Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.797974 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02afc1edce2824b28d720607b08b7afa07b403caa5cbfb17989e1376e3c6df84" Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.798101 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.801028 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad1892a6-e93d-4323-81b3-4e46ad5f0310" containerID="85ab859148a51ca12a8598e188ae58c98c51a2e174b57e5c9350a8cb8a71e1f8" exitCode=0 Jan 29 03:30:03 crc kubenswrapper[4707]: I0129 03:30:03.801489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ad1892a6-e93d-4323-81b3-4e46ad5f0310","Type":"ContainerDied","Data":"85ab859148a51ca12a8598e188ae58c98c51a2e174b57e5c9350a8cb8a71e1f8"} Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.128629 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sfkbk" Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.207186 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.343906 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-secret-volume\") pod \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.343985 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs4bc\" (UniqueName: \"kubernetes.io/projected/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-kube-api-access-zs4bc\") pod \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.344164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-config-volume\") pod \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\" (UID: \"8e997ce0-2a4c-4d1c-9191-4de1ab444f09\") " Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.347641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-config-volume" (OuterVolumeSpecName: "config-volume") pod "8e997ce0-2a4c-4d1c-9191-4de1ab444f09" (UID: "8e997ce0-2a4c-4d1c-9191-4de1ab444f09"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.352281 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-kube-api-access-zs4bc" (OuterVolumeSpecName: "kube-api-access-zs4bc") pod "8e997ce0-2a4c-4d1c-9191-4de1ab444f09" (UID: "8e997ce0-2a4c-4d1c-9191-4de1ab444f09"). InnerVolumeSpecName "kube-api-access-zs4bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.365167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8e997ce0-2a4c-4d1c-9191-4de1ab444f09" (UID: "8e997ce0-2a4c-4d1c-9191-4de1ab444f09"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.446271 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.446327 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs4bc\" (UniqueName: \"kubernetes.io/projected/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-kube-api-access-zs4bc\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.446338 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e997ce0-2a4c-4d1c-9191-4de1ab444f09-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.826474 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.826496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp" event={"ID":"8e997ce0-2a4c-4d1c-9191-4de1ab444f09","Type":"ContainerDied","Data":"e92ab4933b51f08b1e42b9f6567ab07d308c6cba14b6d80f92a8c671b30ad1af"} Jan 29 03:30:04 crc kubenswrapper[4707]: I0129 03:30:04.827449 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e92ab4933b51f08b1e42b9f6567ab07d308c6cba14b6d80f92a8c671b30ad1af" Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.303325 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.470824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kubelet-dir\") pod \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\" (UID: \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\") " Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.470982 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ad1892a6-e93d-4323-81b3-4e46ad5f0310" (UID: "ad1892a6-e93d-4323-81b3-4e46ad5f0310"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.471015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kube-api-access\") pod \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\" (UID: \"ad1892a6-e93d-4323-81b3-4e46ad5f0310\") " Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.471329 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.478493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ad1892a6-e93d-4323-81b3-4e46ad5f0310" (UID: "ad1892a6-e93d-4323-81b3-4e46ad5f0310"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.573287 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad1892a6-e93d-4323-81b3-4e46ad5f0310-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.896856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ad1892a6-e93d-4323-81b3-4e46ad5f0310","Type":"ContainerDied","Data":"b6102d6c2eb53dfaa622bc8b867bf8bea8957a389b1f4133e91b5827fa611c8c"} Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.896917 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6102d6c2eb53dfaa622bc8b867bf8bea8957a389b1f4133e91b5827fa611c8c" Jan 29 03:30:05 crc kubenswrapper[4707]: I0129 03:30:05.897029 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 03:30:08 crc kubenswrapper[4707]: I0129 03:30:08.324396 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-snpzw" Jan 29 03:30:08 crc kubenswrapper[4707]: I0129 03:30:08.761726 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:30:08 crc kubenswrapper[4707]: I0129 03:30:08.766008 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:30:15 crc kubenswrapper[4707]: I0129 03:30:15.268366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " 
pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:30:15 crc kubenswrapper[4707]: I0129 03:30:15.278614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08dd724c-b8cc-45c6-9a61-13643a1c0d75-metrics-certs\") pod \"network-metrics-daemon-652c6\" (UID: \"08dd724c-b8cc-45c6-9a61-13643a1c0d75\") " pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:30:15 crc kubenswrapper[4707]: I0129 03:30:15.472530 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-652c6" Jan 29 03:30:17 crc kubenswrapper[4707]: I0129 03:30:17.927090 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-z575b" Jan 29 03:30:28 crc kubenswrapper[4707]: I0129 03:30:28.728668 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f785m" Jan 29 03:30:29 crc kubenswrapper[4707]: E0129 03:30:29.156665 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 03:30:29 crc kubenswrapper[4707]: E0129 03:30:29.157651 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llwfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nhjg7_openshift-marketplace(a832dac2-976f-45e7-adc9-fc29666d0721): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 03:30:29 crc kubenswrapper[4707]: E0129 03:30:29.159067 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nhjg7" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" Jan 29 03:30:32 crc 
kubenswrapper[4707]: E0129 03:30:32.611422 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-nhjg7" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" Jan 29 03:30:32 crc kubenswrapper[4707]: E0129 03:30:32.710999 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 03:30:32 crc kubenswrapper[4707]: E0129 03:30:32.711270 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4v2zk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wxdsw_openshift-marketplace(c814051a-bbf1-4219-8089-8124cb1d3b7b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 03:30:32 crc kubenswrapper[4707]: E0129 03:30:32.712514 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 03:30:32 crc kubenswrapper[4707]: E0129 03:30:32.712511 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wxdsw" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" Jan 29 03:30:32 crc kubenswrapper[4707]: E0129 03:30:32.712813 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hc7k5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-ss5sc_openshift-marketplace(c24271ec-27aa-4e94-8244-9b05496687ee): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 03:30:32 crc kubenswrapper[4707]: E0129 03:30:32.714082 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ss5sc" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" Jan 29 03:30:33 crc kubenswrapper[4707]: I0129 03:30:33.464385 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:30:33 crc kubenswrapper[4707]: I0129 03:30:33.465000 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:30:34 crc kubenswrapper[4707]: E0129 03:30:34.172869 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wxdsw" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" Jan 29 03:30:34 crc kubenswrapper[4707]: E0129 03:30:34.172912 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling 
image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ss5sc" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" Jan 29 03:30:34 crc kubenswrapper[4707]: I0129 03:30:34.365611 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-652c6"] Jan 29 03:30:34 crc kubenswrapper[4707]: I0129 03:30:34.675967 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.729206 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.729401 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmxt7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hb89x_openshift-marketplace(ad1e990a-db38-4eb8-8ae9-6bd700728e48): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.730699 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hb89x" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" Jan 29 03:30:35 crc 
kubenswrapper[4707]: E0129 03:30:35.784597 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.784933 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9vv8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-ljzgq_openshift-marketplace(231f9fe1-a85f-4ac8-928d-5e76e3b45c9d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.786230 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ljzgq" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.791374 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.791728 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ppvl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gcm8f_openshift-marketplace(97319f63-bd16-4f50-b2d3-65340867a92a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.793067 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gcm8f" podUID="97319f63-bd16-4f50-b2d3-65340867a92a" Jan 29 03:30:35 crc 
kubenswrapper[4707]: E0129 03:30:35.797861 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.798188 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m4hxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-bn9xb_openshift-marketplace(b8e6b828-3c69-4f2d-95f6-2c1f6294a17e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 03:30:35 crc kubenswrapper[4707]: E0129 03:30:35.799407 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bn9xb" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" Jan 29 03:30:36 crc kubenswrapper[4707]: I0129 03:30:36.150595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrvfw" event={"ID":"b4d4ff70-611c-4a65-982c-f551baa66bd5","Type":"ContainerStarted","Data":"da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60"} Jan 29 03:30:36 crc kubenswrapper[4707]: I0129 03:30:36.152470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-652c6" event={"ID":"08dd724c-b8cc-45c6-9a61-13643a1c0d75","Type":"ContainerStarted","Data":"b5f280d1fb0f22827321ea0b7fbd29ebc4c4a8691ca57436be1758c49f055d6e"} Jan 29 03:30:36 crc kubenswrapper[4707]: I0129 03:30:36.152504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-652c6" event={"ID":"08dd724c-b8cc-45c6-9a61-13643a1c0d75","Type":"ContainerStarted","Data":"60b36f1ab773f57f330d0faf6f332dd1c4acc0da878086c6c52d3fd1e1ad7f9c"} Jan 29 03:30:36 crc kubenswrapper[4707]: E0129 03:30:36.154681 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ljzgq" 
podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" Jan 29 03:30:36 crc kubenswrapper[4707]: E0129 03:30:36.155184 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hb89x" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" Jan 29 03:30:36 crc kubenswrapper[4707]: E0129 03:30:36.158490 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bn9xb" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" Jan 29 03:30:37 crc kubenswrapper[4707]: I0129 03:30:37.160661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-652c6" event={"ID":"08dd724c-b8cc-45c6-9a61-13643a1c0d75","Type":"ContainerStarted","Data":"12acf48aa82ef571c8858abba2aec9f6f44aaeee882bfe62c82fd0af0280d896"} Jan 29 03:30:37 crc kubenswrapper[4707]: I0129 03:30:37.163299 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerID="da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60" exitCode=0 Jan 29 03:30:37 crc kubenswrapper[4707]: I0129 03:30:37.163375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrvfw" event={"ID":"b4d4ff70-611c-4a65-982c-f551baa66bd5","Type":"ContainerDied","Data":"da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60"} Jan 29 03:30:37 crc kubenswrapper[4707]: I0129 03:30:37.218812 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-652c6" podStartSLOduration=165.218778942 podStartE2EDuration="2m45.218778942s" 
podCreationTimestamp="2026-01-29 03:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:30:37.192572765 +0000 UTC m=+190.676801700" watchObservedRunningTime="2026-01-29 03:30:37.218778942 +0000 UTC m=+190.703007887" Jan 29 03:30:38 crc kubenswrapper[4707]: I0129 03:30:38.176530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrvfw" event={"ID":"b4d4ff70-611c-4a65-982c-f551baa66bd5","Type":"ContainerStarted","Data":"399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b"} Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.036627 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xrvfw" podStartSLOduration=4.85160234 podStartE2EDuration="44.036611306s" podCreationTimestamp="2026-01-29 03:29:56 +0000 UTC" firstStartedPulling="2026-01-29 03:29:58.400591432 +0000 UTC m=+151.884820337" lastFinishedPulling="2026-01-29 03:30:37.585600398 +0000 UTC m=+191.069829303" observedRunningTime="2026-01-29 03:30:38.204969429 +0000 UTC m=+191.689198374" watchObservedRunningTime="2026-01-29 03:30:40.036611306 +0000 UTC m=+193.520840211" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.037109 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 29 03:30:40 crc kubenswrapper[4707]: E0129 03:30:40.037335 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e997ce0-2a4c-4d1c-9191-4de1ab444f09" containerName="collect-profiles" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.037347 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e997ce0-2a4c-4d1c-9191-4de1ab444f09" containerName="collect-profiles" Jan 29 03:30:40 crc kubenswrapper[4707]: E0129 03:30:40.037357 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad1892a6-e93d-4323-81b3-4e46ad5f0310" containerName="pruner" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.037364 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1892a6-e93d-4323-81b3-4e46ad5f0310" containerName="pruner" Jan 29 03:30:40 crc kubenswrapper[4707]: E0129 03:30:40.037382 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b23b73c-579d-4e4b-bf91-0f60fc5f7aff" containerName="pruner" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.037388 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b23b73c-579d-4e4b-bf91-0f60fc5f7aff" containerName="pruner" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.037488 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b23b73c-579d-4e4b-bf91-0f60fc5f7aff" containerName="pruner" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.037501 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e997ce0-2a4c-4d1c-9191-4de1ab444f09" containerName="collect-profiles" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.037508 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1892a6-e93d-4323-81b3-4e46ad5f0310" containerName="pruner" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.037923 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.040449 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.040790 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.047514 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.200721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65fbb600-7bc1-4a27-b598-3560b5008524-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"65fbb600-7bc1-4a27-b598-3560b5008524\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.200834 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65fbb600-7bc1-4a27-b598-3560b5008524-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"65fbb600-7bc1-4a27-b598-3560b5008524\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.302556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65fbb600-7bc1-4a27-b598-3560b5008524-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"65fbb600-7bc1-4a27-b598-3560b5008524\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.302650 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/65fbb600-7bc1-4a27-b598-3560b5008524-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"65fbb600-7bc1-4a27-b598-3560b5008524\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.302889 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65fbb600-7bc1-4a27-b598-3560b5008524-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"65fbb600-7bc1-4a27-b598-3560b5008524\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.331666 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65fbb600-7bc1-4a27-b598-3560b5008524-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"65fbb600-7bc1-4a27-b598-3560b5008524\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.362748 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 29 03:30:40 crc kubenswrapper[4707]: I0129 03:30:40.563702 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 29 03:30:41 crc kubenswrapper[4707]: I0129 03:30:41.194394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"65fbb600-7bc1-4a27-b598-3560b5008524","Type":"ContainerStarted","Data":"a0dca30931221370126446928a4016227e2dcb028bb940665bfc61ff557dec91"}
Jan 29 03:30:41 crc kubenswrapper[4707]: I0129 03:30:41.194802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"65fbb600-7bc1-4a27-b598-3560b5008524","Type":"ContainerStarted","Data":"f0d57441a06d5b0759d4f0699bc3b476093bb40ca9063ec9354f3207e0b05179"}
Jan 29 03:30:41 crc kubenswrapper[4707]: I0129 03:30:41.213235 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.21321555 podStartE2EDuration="1.21321555s" podCreationTimestamp="2026-01-29 03:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:30:41.207421355 +0000 UTC m=+194.691650260" watchObservedRunningTime="2026-01-29 03:30:41.21321555 +0000 UTC m=+194.697444455"
Jan 29 03:30:42 crc kubenswrapper[4707]: I0129 03:30:42.201853 4707 generic.go:334] "Generic (PLEG): container finished" podID="65fbb600-7bc1-4a27-b598-3560b5008524" containerID="a0dca30931221370126446928a4016227e2dcb028bb940665bfc61ff557dec91" exitCode=0
Jan 29 03:30:42 crc kubenswrapper[4707]: I0129 03:30:42.202027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"65fbb600-7bc1-4a27-b598-3560b5008524","Type":"ContainerDied","Data":"a0dca30931221370126446928a4016227e2dcb028bb940665bfc61ff557dec91"}
Jan 29 03:30:43 crc kubenswrapper[4707]: I0129 03:30:43.410874 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 29 03:30:43 crc kubenswrapper[4707]: I0129 03:30:43.550168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65fbb600-7bc1-4a27-b598-3560b5008524-kube-api-access\") pod \"65fbb600-7bc1-4a27-b598-3560b5008524\" (UID: \"65fbb600-7bc1-4a27-b598-3560b5008524\") "
Jan 29 03:30:43 crc kubenswrapper[4707]: I0129 03:30:43.550309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65fbb600-7bc1-4a27-b598-3560b5008524-kubelet-dir\") pod \"65fbb600-7bc1-4a27-b598-3560b5008524\" (UID: \"65fbb600-7bc1-4a27-b598-3560b5008524\") "
Jan 29 03:30:43 crc kubenswrapper[4707]: I0129 03:30:43.550497 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65fbb600-7bc1-4a27-b598-3560b5008524-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "65fbb600-7bc1-4a27-b598-3560b5008524" (UID: "65fbb600-7bc1-4a27-b598-3560b5008524"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:30:43 crc kubenswrapper[4707]: I0129 03:30:43.550893 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65fbb600-7bc1-4a27-b598-3560b5008524-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 29 03:30:43 crc kubenswrapper[4707]: I0129 03:30:43.556773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fbb600-7bc1-4a27-b598-3560b5008524-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "65fbb600-7bc1-4a27-b598-3560b5008524" (UID: "65fbb600-7bc1-4a27-b598-3560b5008524"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:30:43 crc kubenswrapper[4707]: I0129 03:30:43.652784 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65fbb600-7bc1-4a27-b598-3560b5008524-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 29 03:30:44 crc kubenswrapper[4707]: I0129 03:30:44.217713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"65fbb600-7bc1-4a27-b598-3560b5008524","Type":"ContainerDied","Data":"f0d57441a06d5b0759d4f0699bc3b476093bb40ca9063ec9354f3207e0b05179"}
Jan 29 03:30:44 crc kubenswrapper[4707]: I0129 03:30:44.217786 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0d57441a06d5b0759d4f0699bc3b476093bb40ca9063ec9354f3207e0b05179"
Jan 29 03:30:44 crc kubenswrapper[4707]: I0129 03:30:44.217805 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 29 03:30:46 crc kubenswrapper[4707]: I0129 03:30:46.660441 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:30:46 crc kubenswrapper[4707]: I0129 03:30:46.660980 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:30:46 crc kubenswrapper[4707]: I0129 03:30:46.817775 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.291409 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xrvfw"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.436728 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 29 03:30:47 crc kubenswrapper[4707]: E0129 03:30:47.437130 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fbb600-7bc1-4a27-b598-3560b5008524" containerName="pruner"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.437150 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fbb600-7bc1-4a27-b598-3560b5008524" containerName="pruner"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.437346 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="65fbb600-7bc1-4a27-b598-3560b5008524" containerName="pruner"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.437987 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.442668 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.443139 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.456392 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.515931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.516025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-var-lock\") pod \"installer-9-crc\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.516066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.617678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.617791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.617842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.617903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-var-lock\") pod \"installer-9-crc\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.618077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-var-lock\") pod \"installer-9-crc\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.640241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:47 crc kubenswrapper[4707]: I0129 03:30:47.758578 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 29 03:30:48 crc kubenswrapper[4707]: I0129 03:30:48.444076 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 29 03:30:49 crc kubenswrapper[4707]: I0129 03:30:49.277071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxdsw" event={"ID":"c814051a-bbf1-4219-8089-8124cb1d3b7b","Type":"ContainerStarted","Data":"1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d"}
Jan 29 03:30:49 crc kubenswrapper[4707]: I0129 03:30:49.283126 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerID="eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db" exitCode=0
Jan 29 03:30:49 crc kubenswrapper[4707]: I0129 03:30:49.283214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hb89x" event={"ID":"ad1e990a-db38-4eb8-8ae9-6bd700728e48","Type":"ContainerDied","Data":"eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db"}
Jan 29 03:30:49 crc kubenswrapper[4707]: I0129 03:30:49.285032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa","Type":"ContainerStarted","Data":"f3484a35051274a01645ac1ac47bd2267a67f718eb2835bd3d48f1b2b6bb3a97"}
Jan 29 03:30:49 crc kubenswrapper[4707]: I0129 03:30:49.285069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa","Type":"ContainerStarted","Data":"31ef6bda8ae4030fea8779581ffb0aa8bc93d59a2462b312417e0fe568a92dcc"}
Jan 29 03:30:49 crc kubenswrapper[4707]: I0129 03:30:49.334258 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.334239347 podStartE2EDuration="2.334239347s" podCreationTimestamp="2026-01-29 03:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:30:49.332002362 +0000 UTC m=+202.816231277" watchObservedRunningTime="2026-01-29 03:30:49.334239347 +0000 UTC m=+202.818468252"
Jan 29 03:30:50 crc kubenswrapper[4707]: I0129 03:30:50.294928 4707 generic.go:334] "Generic (PLEG): container finished" podID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerID="1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d" exitCode=0
Jan 29 03:30:50 crc kubenswrapper[4707]: I0129 03:30:50.294984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxdsw" event={"ID":"c814051a-bbf1-4219-8089-8124cb1d3b7b","Type":"ContainerDied","Data":"1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d"}
Jan 29 03:30:58 crc kubenswrapper[4707]: I0129 03:30:58.351561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hb89x" event={"ID":"ad1e990a-db38-4eb8-8ae9-6bd700728e48","Type":"ContainerStarted","Data":"c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a"}
Jan 29 03:30:58 crc kubenswrapper[4707]: I0129 03:30:58.373681 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hb89x" podStartSLOduration=3.008016911 podStartE2EDuration="1m0.373645456s" podCreationTimestamp="2026-01-29 03:29:58 +0000 UTC" firstStartedPulling="2026-01-29 03:30:00.744952265 +0000 UTC m=+154.229181170" lastFinishedPulling="2026-01-29 03:30:58.1105808 +0000 UTC m=+211.594809715" observedRunningTime="2026-01-29 03:30:58.37103277 +0000 UTC m=+211.855261675" watchObservedRunningTime="2026-01-29 03:30:58.373645456 +0000 UTC m=+211.857874361"
Jan 29 03:30:58 crc kubenswrapper[4707]: I0129 03:30:58.609688 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hb89x"
Jan 29 03:30:58 crc kubenswrapper[4707]: I0129 03:30:58.609757 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hb89x"
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.360293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxdsw" event={"ID":"c814051a-bbf1-4219-8089-8124cb1d3b7b","Type":"ContainerStarted","Data":"67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6"}
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.364207 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerID="489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e" exitCode=0
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.364328 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bn9xb" event={"ID":"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e","Type":"ContainerDied","Data":"489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e"}
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.366730 4707 generic.go:334] "Generic (PLEG): container finished" podID="97319f63-bd16-4f50-b2d3-65340867a92a" containerID="645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003" exitCode=0
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.366800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcm8f" event={"ID":"97319f63-bd16-4f50-b2d3-65340867a92a","Type":"ContainerDied","Data":"645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003"}
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.374137 4707 generic.go:334] "Generic (PLEG): container finished" podID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerID="d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2" exitCode=0
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.374252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ljzgq" event={"ID":"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d","Type":"ContainerDied","Data":"d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2"}
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.378067 4707 generic.go:334] "Generic (PLEG): container finished" podID="a832dac2-976f-45e7-adc9-fc29666d0721" containerID="5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59" exitCode=0
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.378155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjg7" event={"ID":"a832dac2-976f-45e7-adc9-fc29666d0721","Type":"ContainerDied","Data":"5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59"}
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.380708 4707 generic.go:334] "Generic (PLEG): container finished" podID="c24271ec-27aa-4e94-8244-9b05496687ee" containerID="f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335" exitCode=0
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.380964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ss5sc" event={"ID":"c24271ec-27aa-4e94-8244-9b05496687ee","Type":"ContainerDied","Data":"f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335"}
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.392014 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wxdsw" podStartSLOduration=3.926081641 podStartE2EDuration="1m0.391975272s" podCreationTimestamp="2026-01-29 03:29:59 +0000 UTC" firstStartedPulling="2026-01-29 03:30:01.72807613 +0000 UTC m=+155.212305035" lastFinishedPulling="2026-01-29 03:30:58.193969751 +0000 UTC m=+211.678198666" observedRunningTime="2026-01-29 03:30:59.388894273 +0000 UTC m=+212.873123198" watchObservedRunningTime="2026-01-29 03:30:59.391975272 +0000 UTC m=+212.876204197"
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.650025 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wxdsw"
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.650072 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wxdsw"
Jan 29 03:30:59 crc kubenswrapper[4707]: I0129 03:30:59.658340 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-hb89x" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerName="registry-server" probeResult="failure" output=<
Jan 29 03:30:59 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s
Jan 29 03:30:59 crc kubenswrapper[4707]: >
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.389604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bn9xb" event={"ID":"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e","Type":"ContainerStarted","Data":"5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9"}
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.392021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcm8f" event={"ID":"97319f63-bd16-4f50-b2d3-65340867a92a","Type":"ContainerStarted","Data":"cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5"}
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.394512 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ljzgq" event={"ID":"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d","Type":"ContainerStarted","Data":"5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f"}
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.397167 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjg7" event={"ID":"a832dac2-976f-45e7-adc9-fc29666d0721","Type":"ContainerStarted","Data":"51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473"}
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.400259 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ss5sc" event={"ID":"c24271ec-27aa-4e94-8244-9b05496687ee","Type":"ContainerStarted","Data":"b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f"}
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.414241 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bn9xb" podStartSLOduration=2.912460798 podStartE2EDuration="1m4.414217361s" podCreationTimestamp="2026-01-29 03:29:56 +0000 UTC" firstStartedPulling="2026-01-29 03:29:58.437088918 +0000 UTC m=+151.921317823" lastFinishedPulling="2026-01-29 03:30:59.938845481 +0000 UTC m=+213.423074386" observedRunningTime="2026-01-29 03:31:00.41346134 +0000 UTC m=+213.897690265" watchObservedRunningTime="2026-01-29 03:31:00.414217361 +0000 UTC m=+213.898446276"
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.439652 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ss5sc" podStartSLOduration=3.076845396 podStartE2EDuration="1m4.439630913s" podCreationTimestamp="2026-01-29 03:29:56 +0000 UTC" firstStartedPulling="2026-01-29 03:29:58.505188033 +0000 UTC m=+151.989416938" lastFinishedPulling="2026-01-29 03:30:59.86797355 +0000 UTC m=+213.352202455" observedRunningTime="2026-01-29 03:31:00.436272937 +0000 UTC m=+213.920501862" watchObservedRunningTime="2026-01-29 03:31:00.439630913 +0000 UTC m=+213.923859838"
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.459221 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nhjg7" podStartSLOduration=1.83725224 podStartE2EDuration="1m4.459193347s" podCreationTimestamp="2026-01-29 03:29:56 +0000 UTC" firstStartedPulling="2026-01-29 03:29:57.322941068 +0000 UTC m=+150.807169973" lastFinishedPulling="2026-01-29 03:30:59.944882175 +0000 UTC m=+213.429111080" observedRunningTime="2026-01-29 03:31:00.458104025 +0000 UTC m=+213.942332950" watchObservedRunningTime="2026-01-29 03:31:00.459193347 +0000 UTC m=+213.943422252"
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.486625 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ljzgq" podStartSLOduration=3.403130321 podStartE2EDuration="1m2.486601356s" podCreationTimestamp="2026-01-29 03:29:58 +0000 UTC" firstStartedPulling="2026-01-29 03:30:00.745645506 +0000 UTC m=+154.229874401" lastFinishedPulling="2026-01-29 03:30:59.829116531 +0000 UTC m=+213.313345436" observedRunningTime="2026-01-29 03:31:00.478913284 +0000 UTC m=+213.963142199" watchObservedRunningTime="2026-01-29 03:31:00.486601356 +0000 UTC m=+213.970830261"
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.502341 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gcm8f" podStartSLOduration=4.552977118 podStartE2EDuration="1m1.502312638s" podCreationTimestamp="2026-01-29 03:29:59 +0000 UTC" firstStartedPulling="2026-01-29 03:30:02.777510926 +0000 UTC m=+156.261739831" lastFinishedPulling="2026-01-29 03:30:59.726846446 +0000 UTC m=+213.211075351" observedRunningTime="2026-01-29 03:31:00.500093294 +0000 UTC m=+213.984322209" watchObservedRunningTime="2026-01-29 03:31:00.502312638 +0000 UTC m=+213.986541543"
Jan 29 03:31:00 crc kubenswrapper[4707]: I0129 03:31:00.691502 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wxdsw" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerName="registry-server" probeResult="failure" output=<
Jan 29 03:31:00 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s
Jan 29 03:31:00 crc kubenswrapper[4707]: >
Jan 29 03:31:03 crc kubenswrapper[4707]: I0129 03:31:03.463117 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 03:31:03 crc kubenswrapper[4707]: I0129 03:31:03.463214 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 03:31:03 crc kubenswrapper[4707]: I0129 03:31:03.463894 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l"
Jan 29 03:31:03 crc kubenswrapper[4707]: I0129 03:31:03.465016 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 29 03:31:03 crc kubenswrapper[4707]: I0129 03:31:03.465249 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93" gracePeriod=600
Jan 29 03:31:04 crc kubenswrapper[4707]: I0129 03:31:04.425449 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93" exitCode=0
Jan 29 03:31:04 crc kubenswrapper[4707]: I0129 03:31:04.425583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93"}
Jan 29 03:31:04 crc kubenswrapper[4707]: I0129 03:31:04.425832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"2018b8d36afa3f2a5c920f93a22bd21150b05028d7be0b59b8d8babfd9ed3779"}
Jan 29 03:31:06 crc kubenswrapper[4707]: I0129 03:31:06.494570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nhjg7"
Jan 29 03:31:06 crc kubenswrapper[4707]: I0129 03:31:06.495108 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nhjg7"
Jan 29 03:31:06 crc kubenswrapper[4707]: I0129 03:31:06.535423 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nhjg7"
Jan 29 03:31:06 crc kubenswrapper[4707]: I0129 03:31:06.954192 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:31:06 crc kubenswrapper[4707]: I0129 03:31:06.954239 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:31:07 crc kubenswrapper[4707]: I0129 03:31:07.031202 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:31:07 crc kubenswrapper[4707]: I0129 03:31:07.130374 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:31:07 crc kubenswrapper[4707]: I0129 03:31:07.130428 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:31:07 crc kubenswrapper[4707]: I0129 03:31:07.169195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:31:07 crc kubenswrapper[4707]: I0129 03:31:07.406595 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k6sdj"]
Jan 29 03:31:07 crc kubenswrapper[4707]: I0129 03:31:07.501896 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ss5sc"
Jan 29 03:31:07 crc kubenswrapper[4707]: I0129 03:31:07.517430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nhjg7"
Jan 29 03:31:07 crc kubenswrapper[4707]: I0129 03:31:07.536219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:31:08 crc kubenswrapper[4707]: I0129 03:31:08.661443 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hb89x"
Jan 29 03:31:08 crc kubenswrapper[4707]: I0129 03:31:08.714248 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hb89x"
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.071504 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ljzgq"
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.072630 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ljzgq"
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.115937 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ljzgq"
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.275571 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bn9xb"]
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.480347 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bn9xb" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerName="registry-server" containerID="cri-o://5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9" gracePeriod=2
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.548195 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ljzgq"
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.701267 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wxdsw"
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.754725 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wxdsw"
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.892657 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ss5sc"]
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.893345 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ss5sc" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" containerName="registry-server" containerID="cri-o://b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f" gracePeriod=2
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.933009 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bn9xb"
Jan 29 03:31:09 crc kubenswrapper[4707]: E0129 03:31:09.945378 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24271ec_27aa_4e94_8244_9b05496687ee.slice/crio-b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f.scope\": RecentStats: unable to find data in memory cache]"
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.982966 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4hxl\" (UniqueName: \"kubernetes.io/projected/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-kube-api-access-m4hxl\") pod \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") "
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.983037 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-catalog-content\") pod \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") "
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.983172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-utilities\") pod \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\" (UID: \"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e\") "
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.984980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-utilities" (OuterVolumeSpecName: "utilities") pod "b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" (UID: "b8e6b828-3c69-4f2d-95f6-2c1f6294a17e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:31:09 crc kubenswrapper[4707]: I0129 03:31:09.995038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-kube-api-access-m4hxl" (OuterVolumeSpecName: "kube-api-access-m4hxl") pod "b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" (UID: "b8e6b828-3c69-4f2d-95f6-2c1f6294a17e"). InnerVolumeSpecName "kube-api-access-m4hxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.058808 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" (UID: "b8e6b828-3c69-4f2d-95f6-2c1f6294a17e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.086134 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4hxl\" (UniqueName: \"kubernetes.io/projected/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-kube-api-access-m4hxl\") on node \"crc\" DevicePath \"\""
Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.086175 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.086189 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.111589 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gcm8f"
Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.111652 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gcm8f"
Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.163493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gcm8f"
Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.212968 4707 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-ss5sc" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.288874 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-utilities\") pod \"c24271ec-27aa-4e94-8244-9b05496687ee\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.288996 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-catalog-content\") pod \"c24271ec-27aa-4e94-8244-9b05496687ee\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.289085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc7k5\" (UniqueName: \"kubernetes.io/projected/c24271ec-27aa-4e94-8244-9b05496687ee-kube-api-access-hc7k5\") pod \"c24271ec-27aa-4e94-8244-9b05496687ee\" (UID: \"c24271ec-27aa-4e94-8244-9b05496687ee\") " Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.289871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-utilities" (OuterVolumeSpecName: "utilities") pod "c24271ec-27aa-4e94-8244-9b05496687ee" (UID: "c24271ec-27aa-4e94-8244-9b05496687ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.297405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24271ec-27aa-4e94-8244-9b05496687ee-kube-api-access-hc7k5" (OuterVolumeSpecName: "kube-api-access-hc7k5") pod "c24271ec-27aa-4e94-8244-9b05496687ee" (UID: "c24271ec-27aa-4e94-8244-9b05496687ee"). InnerVolumeSpecName "kube-api-access-hc7k5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.391156 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc7k5\" (UniqueName: \"kubernetes.io/projected/c24271ec-27aa-4e94-8244-9b05496687ee-kube-api-access-hc7k5\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.391194 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.491415 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerID="5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9" exitCode=0 Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.491531 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bn9xb" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.491675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bn9xb" event={"ID":"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e","Type":"ContainerDied","Data":"5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9"} Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.492258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bn9xb" event={"ID":"b8e6b828-3c69-4f2d-95f6-2c1f6294a17e","Type":"ContainerDied","Data":"3e147951199e2f5e859feb739713c30e574ec20201de21da875f1735f63a9379"} Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.492396 4707 scope.go:117] "RemoveContainer" containerID="5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.496443 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="c24271ec-27aa-4e94-8244-9b05496687ee" containerID="b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f" exitCode=0 Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.496616 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ss5sc" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.497061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ss5sc" event={"ID":"c24271ec-27aa-4e94-8244-9b05496687ee","Type":"ContainerDied","Data":"b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f"} Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.497150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ss5sc" event={"ID":"c24271ec-27aa-4e94-8244-9b05496687ee","Type":"ContainerDied","Data":"755e939f4c507d4999a21c7ac8cc0bf2ceb3c710d02e8e8a2ae22c3d414b60a5"} Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.527622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c24271ec-27aa-4e94-8244-9b05496687ee" (UID: "c24271ec-27aa-4e94-8244-9b05496687ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.534523 4707 scope.go:117] "RemoveContainer" containerID="489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.550186 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bn9xb"] Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.556141 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bn9xb"] Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.569992 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.572451 4707 scope.go:117] "RemoveContainer" containerID="a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.594146 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24271ec-27aa-4e94-8244-9b05496687ee-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.594216 4707 scope.go:117] "RemoveContainer" containerID="5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9" Jan 29 03:31:10 crc kubenswrapper[4707]: E0129 03:31:10.595063 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9\": container with ID starting with 5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9 not found: ID does not exist" containerID="5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.595105 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9"} err="failed to get container status \"5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9\": rpc error: code = NotFound desc = could not find container \"5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9\": container with ID starting with 5205b458cedba65cff3afcccba2acb02fe7634830dd3b3157901eb1179bf90f9 not found: ID does not exist" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.595139 4707 scope.go:117] "RemoveContainer" containerID="489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e" Jan 29 03:31:10 crc kubenswrapper[4707]: E0129 03:31:10.595425 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e\": container with ID starting with 489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e not found: ID does not exist" containerID="489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.595489 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e"} err="failed to get container status \"489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e\": rpc error: code = NotFound desc = could not find container \"489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e\": container with ID starting with 489dc18c561e847103957d1c4cfcecbecce27230ae4ae7451d6c9f75e045069e not found: ID does not exist" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.595523 4707 scope.go:117] "RemoveContainer" containerID="a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af" Jan 29 03:31:10 crc kubenswrapper[4707]: E0129 03:31:10.595828 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af\": container with ID starting with a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af not found: ID does not exist" containerID="a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.595853 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af"} err="failed to get container status \"a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af\": rpc error: code = NotFound desc = could not find container \"a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af\": container with ID starting with a0bca643e0d49f375a49f11e199d1ec40e37a7bd5736741d7ae0aaa72f27b2af not found: ID does not exist" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.595866 4707 scope.go:117] "RemoveContainer" containerID="b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.634491 4707 scope.go:117] "RemoveContainer" containerID="f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.655437 4707 scope.go:117] "RemoveContainer" containerID="feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.673884 4707 scope.go:117] "RemoveContainer" containerID="b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f" Jan 29 03:31:10 crc kubenswrapper[4707]: E0129 03:31:10.674441 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f\": container with ID starting with 
b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f not found: ID does not exist" containerID="b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.674510 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f"} err="failed to get container status \"b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f\": rpc error: code = NotFound desc = could not find container \"b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f\": container with ID starting with b7d4b814ed497c1bdd6541205bf0e497c63bae43f5650680d4f787676d89398f not found: ID does not exist" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.674647 4707 scope.go:117] "RemoveContainer" containerID="f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335" Jan 29 03:31:10 crc kubenswrapper[4707]: E0129 03:31:10.675169 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335\": container with ID starting with f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335 not found: ID does not exist" containerID="f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.675235 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335"} err="failed to get container status \"f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335\": rpc error: code = NotFound desc = could not find container \"f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335\": container with ID starting with f40434eb8d2f53c3927d1c0bc348e6be1daef062815b5428d2d5b267a1952335 not found: ID does not 
exist" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.675273 4707 scope.go:117] "RemoveContainer" containerID="feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6" Jan 29 03:31:10 crc kubenswrapper[4707]: E0129 03:31:10.676425 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6\": container with ID starting with feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6 not found: ID does not exist" containerID="feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.676485 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6"} err="failed to get container status \"feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6\": rpc error: code = NotFound desc = could not find container \"feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6\": container with ID starting with feacaecf002a2b2af69d2815ba3ff947aa090d701b1fb851af937b0a5f0713d6 not found: ID does not exist" Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.825391 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ss5sc"] Jan 29 03:31:10 crc kubenswrapper[4707]: I0129 03:31:10.829183 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ss5sc"] Jan 29 03:31:11 crc kubenswrapper[4707]: I0129 03:31:11.253663 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" path="/var/lib/kubelet/pods/b8e6b828-3c69-4f2d-95f6-2c1f6294a17e/volumes" Jan 29 03:31:11 crc kubenswrapper[4707]: I0129 03:31:11.254381 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c24271ec-27aa-4e94-8244-9b05496687ee" path="/var/lib/kubelet/pods/c24271ec-27aa-4e94-8244-9b05496687ee/volumes" Jan 29 03:31:11 crc kubenswrapper[4707]: I0129 03:31:11.674216 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ljzgq"] Jan 29 03:31:12 crc kubenswrapper[4707]: I0129 03:31:12.514301 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ljzgq" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerName="registry-server" containerID="cri-o://5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f" gracePeriod=2 Jan 29 03:31:12 crc kubenswrapper[4707]: I0129 03:31:12.865199 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:31:12 crc kubenswrapper[4707]: I0129 03:31:12.928578 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-catalog-content\") pod \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " Jan 29 03:31:12 crc kubenswrapper[4707]: I0129 03:31:12.928748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-utilities\") pod \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " Jan 29 03:31:12 crc kubenswrapper[4707]: I0129 03:31:12.928881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vv8w\" (UniqueName: \"kubernetes.io/projected/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-kube-api-access-9vv8w\") pod \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\" (UID: \"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d\") " Jan 29 03:31:12 crc kubenswrapper[4707]: I0129 03:31:12.930073 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-utilities" (OuterVolumeSpecName: "utilities") pod "231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" (UID: "231f9fe1-a85f-4ac8-928d-5e76e3b45c9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:31:12 crc kubenswrapper[4707]: I0129 03:31:12.937970 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-kube-api-access-9vv8w" (OuterVolumeSpecName: "kube-api-access-9vv8w") pod "231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" (UID: "231f9fe1-a85f-4ac8-928d-5e76e3b45c9d"). InnerVolumeSpecName "kube-api-access-9vv8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:31:12 crc kubenswrapper[4707]: I0129 03:31:12.972632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" (UID: "231f9fe1-a85f-4ac8-928d-5e76e3b45c9d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.030509 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vv8w\" (UniqueName: \"kubernetes.io/projected/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-kube-api-access-9vv8w\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.030939 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.031026 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.524866 4707 generic.go:334] "Generic (PLEG): container finished" podID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerID="5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f" exitCode=0 Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.524929 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ljzgq" event={"ID":"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d","Type":"ContainerDied","Data":"5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f"} Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.525068 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ljzgq" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.525881 4707 scope.go:117] "RemoveContainer" containerID="5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.525780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ljzgq" event={"ID":"231f9fe1-a85f-4ac8-928d-5e76e3b45c9d","Type":"ContainerDied","Data":"6e663d2a1ba47e2d37e06136a3bb0bee05a068c31de74a5f7813cd78cac3a0df"} Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.548734 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ljzgq"] Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.549875 4707 scope.go:117] "RemoveContainer" containerID="d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.579591 4707 scope.go:117] "RemoveContainer" containerID="9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.580235 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ljzgq"] Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.609595 4707 scope.go:117] "RemoveContainer" containerID="5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f" Jan 29 03:31:13 crc kubenswrapper[4707]: E0129 03:31:13.610282 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f\": container with ID starting with 5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f not found: ID does not exist" containerID="5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.610348 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f"} err="failed to get container status \"5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f\": rpc error: code = NotFound desc = could not find container \"5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f\": container with ID starting with 5bced86cc9e292fa5652af089ba292e25ed144dffb25ca026c2da9eef5a4de9f not found: ID does not exist" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.610385 4707 scope.go:117] "RemoveContainer" containerID="d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2" Jan 29 03:31:13 crc kubenswrapper[4707]: E0129 03:31:13.610766 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2\": container with ID starting with d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2 not found: ID does not exist" containerID="d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.610813 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2"} err="failed to get container status \"d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2\": rpc error: code = NotFound desc = could not find container \"d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2\": container with ID starting with d522c0065a74ecfc13ed48fbc1c81b712ce1435b76a8a0396a6db87ae52843b2 not found: ID does not exist" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.610849 4707 scope.go:117] "RemoveContainer" containerID="9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1" Jan 29 03:31:13 crc kubenswrapper[4707]: E0129 
03:31:13.611148 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1\": container with ID starting with 9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1 not found: ID does not exist" containerID="9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1" Jan 29 03:31:13 crc kubenswrapper[4707]: I0129 03:31:13.611175 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1"} err="failed to get container status \"9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1\": rpc error: code = NotFound desc = could not find container \"9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1\": container with ID starting with 9322d92ae2728b4218040ddb34315fd5364d28ac8a60587e1264a7be31bd7de1 not found: ID does not exist" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.077190 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gcm8f"] Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.077483 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gcm8f" podUID="97319f63-bd16-4f50-b2d3-65340867a92a" containerName="registry-server" containerID="cri-o://cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5" gracePeriod=2 Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.460756 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.535080 4707 generic.go:334] "Generic (PLEG): container finished" podID="97319f63-bd16-4f50-b2d3-65340867a92a" containerID="cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5" exitCode=0 Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.535135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcm8f" event={"ID":"97319f63-bd16-4f50-b2d3-65340867a92a","Type":"ContainerDied","Data":"cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5"} Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.535166 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gcm8f" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.535192 4707 scope.go:117] "RemoveContainer" containerID="cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.535171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcm8f" event={"ID":"97319f63-bd16-4f50-b2d3-65340867a92a","Type":"ContainerDied","Data":"5a603d8fd94f1f43e807d7fe9ab0ea6172f6d18401e2072a90c8e74d633c179f"} Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.553764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ppvl\" (UniqueName: \"kubernetes.io/projected/97319f63-bd16-4f50-b2d3-65340867a92a-kube-api-access-4ppvl\") pod \"97319f63-bd16-4f50-b2d3-65340867a92a\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.553974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-utilities\") pod 
\"97319f63-bd16-4f50-b2d3-65340867a92a\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.554082 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-catalog-content\") pod \"97319f63-bd16-4f50-b2d3-65340867a92a\" (UID: \"97319f63-bd16-4f50-b2d3-65340867a92a\") " Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.554828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-utilities" (OuterVolumeSpecName: "utilities") pod "97319f63-bd16-4f50-b2d3-65340867a92a" (UID: "97319f63-bd16-4f50-b2d3-65340867a92a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.558705 4707 scope.go:117] "RemoveContainer" containerID="645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.559203 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97319f63-bd16-4f50-b2d3-65340867a92a-kube-api-access-4ppvl" (OuterVolumeSpecName: "kube-api-access-4ppvl") pod "97319f63-bd16-4f50-b2d3-65340867a92a" (UID: "97319f63-bd16-4f50-b2d3-65340867a92a"). InnerVolumeSpecName "kube-api-access-4ppvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.590157 4707 scope.go:117] "RemoveContainer" containerID="d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.604355 4707 scope.go:117] "RemoveContainer" containerID="cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5" Jan 29 03:31:14 crc kubenswrapper[4707]: E0129 03:31:14.605012 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5\": container with ID starting with cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5 not found: ID does not exist" containerID="cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.605236 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5"} err="failed to get container status \"cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5\": rpc error: code = NotFound desc = could not find container \"cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5\": container with ID starting with cfe1fdb846ee6271d13c3d6b81362ff14d3c5d8d2f4e0848bec27e4a8c2031b5 not found: ID does not exist" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.605262 4707 scope.go:117] "RemoveContainer" containerID="645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003" Jan 29 03:31:14 crc kubenswrapper[4707]: E0129 03:31:14.605462 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003\": container with ID starting with 
645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003 not found: ID does not exist" containerID="645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.605487 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003"} err="failed to get container status \"645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003\": rpc error: code = NotFound desc = could not find container \"645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003\": container with ID starting with 645d35f8662eba0d390e240628aa64b180d330e849a4be8e929f1c0d9aa37003 not found: ID does not exist" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.605501 4707 scope.go:117] "RemoveContainer" containerID="d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc" Jan 29 03:31:14 crc kubenswrapper[4707]: E0129 03:31:14.606037 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc\": container with ID starting with d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc not found: ID does not exist" containerID="d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.606067 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc"} err="failed to get container status \"d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc\": rpc error: code = NotFound desc = could not find container \"d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc\": container with ID starting with d937a438341383d73b21c28e9513a859cd80c1cf7ceec641181abc65fa1365dc not found: ID does not 
exist" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.654859 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.654904 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ppvl\" (UniqueName: \"kubernetes.io/projected/97319f63-bd16-4f50-b2d3-65340867a92a-kube-api-access-4ppvl\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.706008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97319f63-bd16-4f50-b2d3-65340867a92a" (UID: "97319f63-bd16-4f50-b2d3-65340867a92a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.756119 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97319f63-bd16-4f50-b2d3-65340867a92a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.863428 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gcm8f"] Jan 29 03:31:14 crc kubenswrapper[4707]: I0129 03:31:14.874594 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gcm8f"] Jan 29 03:31:15 crc kubenswrapper[4707]: I0129 03:31:15.249853 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" path="/var/lib/kubelet/pods/231f9fe1-a85f-4ac8-928d-5e76e3b45c9d/volumes" Jan 29 03:31:15 crc kubenswrapper[4707]: I0129 03:31:15.250643 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="97319f63-bd16-4f50-b2d3-65340867a92a" path="/var/lib/kubelet/pods/97319f63-bd16-4f50-b2d3-65340867a92a/volumes" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.724284 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725191 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725262 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725277 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerName="extract-utilities" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725287 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerName="extract-utilities" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725301 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerName="extract-content" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725309 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerName="extract-content" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725318 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" containerName="extract-content" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725326 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" containerName="extract-content" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725344 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="97319f63-bd16-4f50-b2d3-65340867a92a" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725353 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97319f63-bd16-4f50-b2d3-65340867a92a" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725369 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerName="extract-utilities" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725379 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerName="extract-utilities" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725392 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725400 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725410 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97319f63-bd16-4f50-b2d3-65340867a92a" containerName="extract-content" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725419 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97319f63-bd16-4f50-b2d3-65340867a92a" containerName="extract-content" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725435 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725444 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725457 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="97319f63-bd16-4f50-b2d3-65340867a92a" containerName="extract-utilities" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725466 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="97319f63-bd16-4f50-b2d3-65340867a92a" containerName="extract-utilities" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725478 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" containerName="extract-utilities" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725486 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" containerName="extract-utilities" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.725501 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerName="extract-content" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725509 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerName="extract-content" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725687 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="97319f63-bd16-4f50-b2d3-65340867a92a" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725706 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24271ec-27aa-4e94-8244-9b05496687ee" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725718 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="231f9fe1-a85f-4ac8-928d-5e76e3b45c9d" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.725735 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e6b828-3c69-4f2d-95f6-2c1f6294a17e" containerName="registry-server" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.726241 4707 
kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.726796 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f" gracePeriod=15 Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.727039 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.727278 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39" gracePeriod=15 Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.727420 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075" gracePeriod=15 Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.727512 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a" gracePeriod=15 Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.727408 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f" gracePeriod=15 Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.727820 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.728251 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728277 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.728298 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728311 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.728329 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728341 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.728364 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728376 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 29 
03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.728397 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728411 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.728424 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728439 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 03:31:26 crc kubenswrapper[4707]: E0129 03:31:26.728461 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728476 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728719 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728739 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728756 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728769 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.728787 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.729142 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.778403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.778504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.778602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.778653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.778721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.778762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.778831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.779223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.783036 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.881225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.881364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.881416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.881560 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.881647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.881861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.881723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.881783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.881729 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.882045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:26 
crc kubenswrapper[4707]: I0129 03:31:26.882146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.882093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.882245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.882265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.882295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:26 crc kubenswrapper[4707]: I0129 03:31:26.882382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.079050 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 03:31:27 crc kubenswrapper[4707]: E0129 03:31:27.124785 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f161bc6171b1d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-29 03:31:27.123225373 +0000 UTC m=+240.607454318,LastTimestamp:2026-01-29 03:31:27.123225373 +0000 UTC m=+240.607454318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.248825 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": 
dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.249682 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: E0129 03:31:27.621979 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: E0129 03:31:27.622494 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: E0129 03:31:27.623345 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: E0129 03:31:27.623911 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: E0129 03:31:27.624387 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 
03:31:27.624459 4707 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 29 03:31:27 crc kubenswrapper[4707]: E0129 03:31:27.625009 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="200ms" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.635106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900"} Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.635159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c764ee646f440cf1313215c2ff51704ce2b8422d717b4cf8d88425e66d283d8e"} Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.635786 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.638528 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.640594 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.641788 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39" exitCode=0 Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.641835 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a" exitCode=0 Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.641853 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f" exitCode=0 Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.641871 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075" exitCode=2 Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.641879 4707 scope.go:117] "RemoveContainer" containerID="3b805bd277d4dd0d8b0e42b2e3905f176427c8cd9bc46b65d0479cb3b3f5fcd4" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.644620 4707 generic.go:334] "Generic (PLEG): container finished" podID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" containerID="f3484a35051274a01645ac1ac47bd2267a67f718eb2835bd3d48f1b2b6bb3a97" exitCode=0 Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.644663 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa","Type":"ContainerDied","Data":"f3484a35051274a01645ac1ac47bd2267a67f718eb2835bd3d48f1b2b6bb3a97"} Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.645198 4707 status_manager.go:851] "Failed to get 
status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: I0129 03:31:27.645855 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:27 crc kubenswrapper[4707]: E0129 03:31:27.826816 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="400ms" Jan 29 03:31:28 crc kubenswrapper[4707]: E0129 03:31:28.228954 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="800ms" Jan 29 03:31:28 crc kubenswrapper[4707]: I0129 03:31:28.654752 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 03:31:29 crc kubenswrapper[4707]: E0129 03:31:29.029953 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="1.6s" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.113762 4707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.114997 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.115531 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.119844 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.121152 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.121685 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.124242 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.124778 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.221471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.221614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-var-lock\") pod \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 
03:31:29.221680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kube-api-access\") pod \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.221762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kubelet-dir\") pod \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\" (UID: \"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa\") " Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.221671 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.221793 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-var-lock" (OuterVolumeSpecName: "var-lock") pod "d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" (UID: "d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.221931 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.221962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.221967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" (UID: "d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.222009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.222108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.222503 4707 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.222578 4707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.222604 4707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.222621 4707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-var-lock\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.222639 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.229998 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" (UID: "d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.251829 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.324784 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.669233 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.670727 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f" exitCode=0 Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.670809 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.670868 4707 scope.go:117] "RemoveContainer" containerID="88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.672004 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.672457 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.673309 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.675075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa","Type":"ContainerDied","Data":"31ef6bda8ae4030fea8779581ffb0aa8bc93d59a2462b312417e0fe568a92dcc"} Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.675163 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.675163 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ef6bda8ae4030fea8779581ffb0aa8bc93d59a2462b312417e0fe568a92dcc" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.678141 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.679074 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.679879 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.684508 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.685099 4707 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.685563 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.701315 4707 scope.go:117] "RemoveContainer" containerID="5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.726330 4707 scope.go:117] "RemoveContainer" containerID="c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.748306 4707 scope.go:117] "RemoveContainer" containerID="64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.768612 4707 scope.go:117] "RemoveContainer" containerID="abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.792437 4707 scope.go:117] "RemoveContainer" containerID="53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.826029 4707 scope.go:117] "RemoveContainer" containerID="88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39" Jan 29 03:31:29 crc kubenswrapper[4707]: E0129 03:31:29.827305 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\": container with ID starting with 88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39 not found: ID does not exist" containerID="88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.827340 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39"} err="failed to get container status \"88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\": rpc error: code = NotFound desc = could not find container \"88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39\": container with ID starting with 88b4a2807e2431193ec9a5bd1d1700d792159e2f3fad9d2bf4b3945f017d1b39 not found: ID does not exist" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.827368 4707 scope.go:117] "RemoveContainer" containerID="5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a" Jan 29 03:31:29 crc kubenswrapper[4707]: E0129 03:31:29.827759 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\": container with ID starting with 5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a not found: ID does not exist" containerID="5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.827779 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a"} err="failed to get container status \"5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\": rpc error: code = NotFound desc = could not find container \"5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a\": container with ID 
starting with 5514aa242cc079f38c6225890ee14afe035efeed3243dc805c482e316ccd943a not found: ID does not exist" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.827795 4707 scope.go:117] "RemoveContainer" containerID="c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f" Jan 29 03:31:29 crc kubenswrapper[4707]: E0129 03:31:29.828035 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\": container with ID starting with c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f not found: ID does not exist" containerID="c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.828054 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f"} err="failed to get container status \"c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\": rpc error: code = NotFound desc = could not find container \"c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f\": container with ID starting with c65f0cffe192dfcd9a9b538d53a594883c8eb91628cc5bc36834c09b86282a1f not found: ID does not exist" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.828067 4707 scope.go:117] "RemoveContainer" containerID="64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075" Jan 29 03:31:29 crc kubenswrapper[4707]: E0129 03:31:29.828385 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\": container with ID starting with 64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075 not found: ID does not exist" containerID="64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075" Jan 29 
03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.828409 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075"} err="failed to get container status \"64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\": rpc error: code = NotFound desc = could not find container \"64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075\": container with ID starting with 64a8dfc67a932fd222dd5a6638061fddb07ea5166ae43aa12f60ce70a763f075 not found: ID does not exist" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.828423 4707 scope.go:117] "RemoveContainer" containerID="abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f" Jan 29 03:31:29 crc kubenswrapper[4707]: E0129 03:31:29.829164 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\": container with ID starting with abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f not found: ID does not exist" containerID="abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.829204 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f"} err="failed to get container status \"abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\": rpc error: code = NotFound desc = could not find container \"abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f\": container with ID starting with abe321a3aa1924adff91311204753b290d3089acffe8b2dd9fa78ba38c62441f not found: ID does not exist" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.829226 4707 scope.go:117] "RemoveContainer" 
containerID="53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6" Jan 29 03:31:29 crc kubenswrapper[4707]: E0129 03:31:29.829905 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\": container with ID starting with 53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6 not found: ID does not exist" containerID="53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6" Jan 29 03:31:29 crc kubenswrapper[4707]: I0129 03:31:29.829936 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6"} err="failed to get container status \"53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\": rpc error: code = NotFound desc = could not find container \"53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6\": container with ID starting with 53e76f861e7946849e2eb4a4d8bdd8bdfc0a6d2ec50b98255be6536dcd95fec6 not found: ID does not exist" Jan 29 03:31:30 crc kubenswrapper[4707]: E0129 03:31:30.631722 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="3.2s" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.438124 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" containerName="oauth-openshift" containerID="cri-o://35f369920fccfbd42076ae282332aa0acfcfc5b674689300e1a416cf0deb5c7b" gracePeriod=15 Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.700112 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="9194d298-b1b5-4b06-9254-b484dc1a1382" containerID="35f369920fccfbd42076ae282332aa0acfcfc5b674689300e1a416cf0deb5c7b" exitCode=0 Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.700212 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" event={"ID":"9194d298-b1b5-4b06-9254-b484dc1a1382","Type":"ContainerDied","Data":"35f369920fccfbd42076ae282332aa0acfcfc5b674689300e1a416cf0deb5c7b"} Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.781154 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.782083 4707 status_manager.go:851] "Failed to get status for pod" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-k6sdj\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.782803 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.784717 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875163 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-dir\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-idp-0-file-data\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875345 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-ocp-branding-template\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-error\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875385 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-login\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-policies\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-session\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-service-ca\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxjqw\" (UniqueName: \"kubernetes.io/projected/9194d298-b1b5-4b06-9254-b484dc1a1382-kube-api-access-jxjqw\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-router-certs\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-trusted-ca-bundle\") pod 
\"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-provider-selection\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-serving-cert\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-cliconfig\") pod \"9194d298-b1b5-4b06-9254-b484dc1a1382\" (UID: \"9194d298-b1b5-4b06-9254-b484dc1a1382\") " Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.875952 4707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: 
I0129 03:31:32.876832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.877125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.877581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.877921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.882925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.883850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.885270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.885705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9194d298-b1b5-4b06-9254-b484dc1a1382-kube-api-access-jxjqw" (OuterVolumeSpecName: "kube-api-access-jxjqw") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "kube-api-access-jxjqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.887365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.895951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.896104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.896407 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.897031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9194d298-b1b5-4b06-9254-b484dc1a1382" (UID: "9194d298-b1b5-4b06-9254-b484dc1a1382"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977419 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977464 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977481 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977497 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxjqw\" (UniqueName: \"kubernetes.io/projected/9194d298-b1b5-4b06-9254-b484dc1a1382-kube-api-access-jxjqw\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977509 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc 
kubenswrapper[4707]: I0129 03:31:32.977522 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977564 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977580 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977593 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977645 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977659 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977672 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:32 crc kubenswrapper[4707]: I0129 03:31:32.977687 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9194d298-b1b5-4b06-9254-b484dc1a1382-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 29 03:31:33 crc kubenswrapper[4707]: I0129 03:31:33.711095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" event={"ID":"9194d298-b1b5-4b06-9254-b484dc1a1382","Type":"ContainerDied","Data":"c5234fe324856261d88262adb19a7dc37223978ecf3275ecd3306a8e6085f70a"} Jan 29 03:31:33 crc kubenswrapper[4707]: I0129 03:31:33.711177 4707 scope.go:117] "RemoveContainer" containerID="35f369920fccfbd42076ae282332aa0acfcfc5b674689300e1a416cf0deb5c7b" Jan 29 03:31:33 crc kubenswrapper[4707]: I0129 03:31:33.711374 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" Jan 29 03:31:33 crc kubenswrapper[4707]: I0129 03:31:33.712461 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:33 crc kubenswrapper[4707]: I0129 03:31:33.714883 4707 status_manager.go:851] "Failed to get status for pod" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-k6sdj\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:33 crc kubenswrapper[4707]: I0129 03:31:33.715384 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:33 crc kubenswrapper[4707]: I0129 03:31:33.719927 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:33 crc kubenswrapper[4707]: I0129 03:31:33.720759 4707 status_manager.go:851] "Failed to get status for pod" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-k6sdj\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:33 crc kubenswrapper[4707]: I0129 03:31:33.721462 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:33 crc kubenswrapper[4707]: E0129 03:31:33.833095 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="6.4s" Jan 29 03:31:35 crc kubenswrapper[4707]: E0129 03:31:35.958788 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f161bc6171b1d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-29 03:31:27.123225373 +0000 UTC m=+240.607454318,LastTimestamp:2026-01-29 03:31:27.123225373 +0000 UTC m=+240.607454318,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 29 03:31:37 crc kubenswrapper[4707]: I0129 03:31:37.246317 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:37 crc kubenswrapper[4707]: I0129 03:31:37.247142 4707 status_manager.go:851] "Failed to get status for pod" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-k6sdj\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:37 crc kubenswrapper[4707]: I0129 03:31:37.247806 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:38 crc kubenswrapper[4707]: E0129 03:31:38.248568 4707 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-z575b" volumeName="registry-storage" Jan 29 03:31:40 crc kubenswrapper[4707]: E0129 03:31:40.234275 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="7s" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.242661 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.243830 4707 status_manager.go:851] "Failed to get status for pod" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-k6sdj\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.244352 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.245001 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.259137 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.259185 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:40 crc kubenswrapper[4707]: E0129 03:31:40.259874 4707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.260636 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.760269 4707 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c3255e9c6553dfe7ab3f6f4af5ffa45ba621a2b0b7b37028fa99ce415d414310" exitCode=0 Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.760355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c3255e9c6553dfe7ab3f6f4af5ffa45ba621a2b0b7b37028fa99ce415d414310"} Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.760931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8543c334c42488220435185465c6a96b9d8480e0f66053e9bf5478880f578873"} Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.761384 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.761409 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.761931 4707 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:40 crc kubenswrapper[4707]: E0129 03:31:40.761979 4707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.762268 4707 status_manager.go:851] "Failed to get status for pod" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-k6sdj\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.762776 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.765494 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.765588 4707 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb" exitCode=1 Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 
03:31:40.765627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb"} Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.766288 4707 scope.go:117] "RemoveContainer" containerID="80acc674d308af032d5311c5e9d48f349c0a03aab03283151fb79e4062f710bb" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.766654 4707 status_manager.go:851] "Failed to get status for pod" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" pod="openshift-authentication/oauth-openshift-558db77b4-k6sdj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-k6sdj\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.767212 4707 status_manager.go:851] "Failed to get status for pod" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.767448 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:40 crc kubenswrapper[4707]: I0129 03:31:40.767662 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Jan 29 03:31:41 crc kubenswrapper[4707]: I0129 03:31:41.138925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 03:31:41 crc kubenswrapper[4707]: I0129 03:31:41.780665 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 29 03:31:41 crc kubenswrapper[4707]: I0129 03:31:41.781223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4f64a8f8a8250062124d32cfb778357e8ae6f3742eaf06b30e1ba2b28086fd29"} Jan 29 03:31:41 crc kubenswrapper[4707]: I0129 03:31:41.784931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0cbe52ad494f8164e62e06cb9cfb09583e35aae3708d0c4eebb64df74be241a5"} Jan 29 03:31:41 crc kubenswrapper[4707]: I0129 03:31:41.784986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d9cbda9cf1d821f1c287ebd84b9a1bc4896e94119691136c7b6419bbadb1ff28"} Jan 29 03:31:41 crc kubenswrapper[4707]: I0129 03:31:41.785001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4d05052fd7c2e590f3ea8040aae0149d2337ea7d95d1dfb2c301a28cd8309e67"} Jan 29 03:31:41 crc kubenswrapper[4707]: I0129 03:31:41.785017 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ed2a6652b549c1ba6b8b707125e12d5ceb373fea3f760a37f103d6c60fa939a7"} Jan 29 03:31:42 crc kubenswrapper[4707]: I0129 03:31:42.793398 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77b962557f355be8de2e2a6e0cff7c4b68f4a83d8d781bb7505e33ca607d6e64"} Jan 29 03:31:42 crc kubenswrapper[4707]: I0129 03:31:42.793896 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:42 crc kubenswrapper[4707]: I0129 03:31:42.793922 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:45 crc kubenswrapper[4707]: I0129 03:31:45.261091 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:45 crc kubenswrapper[4707]: I0129 03:31:45.261451 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:45 crc kubenswrapper[4707]: I0129 03:31:45.269202 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:47 crc kubenswrapper[4707]: I0129 03:31:47.821006 4707 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:47 crc kubenswrapper[4707]: I0129 03:31:47.830355 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6f9db2-7632-4f90-85d7-bd8dc14a1fca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:31:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:31:40Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:31:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T03:31:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed2a6652b549c1ba6b8b707125e12d5ceb373fea3f760a37f103d6c60fa939a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://d9cbda9cf1d821f1c287ebd84b9a1bc4896e94119691136c7b6419bbadb1ff28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d05052fd7c2e590f3ea8040aae0149d2337ea7d95d1dfb2c301a28cd8309e67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77b962557f355be8de2e2a6e0cff7c4b68f4a83d8d781bb7505e33ca607d6e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cbe52ad494f8164e62e06cb9cfb09583e35aae3708d0c4eebb64df74be241a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T03:31:41Z\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3255e9c6553dfe7ab3f6f4af5ffa45ba621a2b0b7b37028fa99ce415d414310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3255e9c6553dfe7ab3f6f4af5ffa45ba621a2b0b7b37028fa99ce415d414310\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T03:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T03:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}]}}\" for 
pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Jan 29 03:31:48 crc kubenswrapper[4707]: I0129 03:31:48.833504 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:48 crc kubenswrapper[4707]: I0129 03:31:48.833520 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:48 crc kubenswrapper[4707]: I0129 03:31:48.833578 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:48 crc kubenswrapper[4707]: I0129 03:31:48.842389 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:48 crc kubenswrapper[4707]: I0129 03:31:48.846985 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0d6c1cc4-b66b-4d63-abec-9ce0c735ac6c" Jan 29 03:31:49 crc kubenswrapper[4707]: I0129 03:31:49.509742 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 03:31:49 crc kubenswrapper[4707]: I0129 03:31:49.839273 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:49 crc kubenswrapper[4707]: I0129 03:31:49.839748 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:49 crc kubenswrapper[4707]: I0129 03:31:49.842634 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0d6c1cc4-b66b-4d63-abec-9ce0c735ac6c" Jan 29 03:31:50 crc kubenswrapper[4707]: I0129 03:31:50.845864 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:50 crc kubenswrapper[4707]: I0129 03:31:50.845914 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8d6f9db2-7632-4f90-85d7-bd8dc14a1fca" Jan 29 03:31:50 crc kubenswrapper[4707]: I0129 03:31:50.850159 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0d6c1cc4-b66b-4d63-abec-9ce0c735ac6c" Jan 29 03:31:51 crc kubenswrapper[4707]: I0129 03:31:51.138879 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 03:31:51 crc kubenswrapper[4707]: I0129 03:31:51.145107 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 03:31:51 crc kubenswrapper[4707]: I0129 03:31:51.860561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 03:31:57 crc kubenswrapper[4707]: I0129 03:31:57.615702 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 29 03:31:57 crc kubenswrapper[4707]: I0129 03:31:57.979209 4707 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 29 03:31:58 crc kubenswrapper[4707]: I0129 03:31:58.246051 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 29 03:31:58 crc 
kubenswrapper[4707]: I0129 03:31:58.385378 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 29 03:31:58 crc kubenswrapper[4707]: I0129 03:31:58.523280 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 29 03:31:58 crc kubenswrapper[4707]: I0129 03:31:58.999769 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.116621 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.263364 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.280867 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.323984 4707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.326554 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=33.32651112 podStartE2EDuration="33.32651112s" podCreationTimestamp="2026-01-29 03:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:31:47.73398957 +0000 UTC m=+261.218218485" watchObservedRunningTime="2026-01-29 03:31:59.32651112 +0000 UTC m=+272.810740025" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.329138 4707 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-k6sdj"] Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.329202 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.338910 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.354067 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.418861 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.418839782 podStartE2EDuration="12.418839782s" podCreationTimestamp="2026-01-29 03:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:31:59.354521478 +0000 UTC m=+272.838750383" watchObservedRunningTime="2026-01-29 03:31:59.418839782 +0000 UTC m=+272.903068697" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.443717 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.582964 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.640135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.656274 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.677121 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.731258 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 29 03:31:59 crc kubenswrapper[4707]: I0129 03:31:59.896987 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.148810 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.204779 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.291606 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.606156 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.619046 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.655243 4707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.669523 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.741641 4707 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.772255 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.773392 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.832665 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.887636 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.898464 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 29 03:32:00 crc kubenswrapper[4707]: I0129 03:32:00.913232 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.113896 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.140849 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.229304 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.257050 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" path="/var/lib/kubelet/pods/9194d298-b1b5-4b06-9254-b484dc1a1382/volumes" Jan 29 03:32:01 crc 
kubenswrapper[4707]: I0129 03:32:01.260191 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.305747 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.349723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.426946 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.433857 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.628720 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.663080 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.693368 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.733433 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.794761 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.837048 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.847937 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.966125 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 29 03:32:01 crc kubenswrapper[4707]: I0129 03:32:01.966190 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.009741 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.056107 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.074610 4707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.168897 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.202681 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.583339 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.585294 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 29 03:32:02 crc 
kubenswrapper[4707]: I0129 03:32:02.658346 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.697063 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.752490 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.894720 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.905977 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.968095 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 29 03:32:02 crc kubenswrapper[4707]: I0129 03:32:02.998853 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.118243 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.162873 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.204158 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.246902 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.598203 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.601727 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.849369 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.868584 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.955094 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.974593 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 29 03:32:03 crc kubenswrapper[4707]: I0129 03:32:03.987998 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.054339 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.099238 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.100424 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 
03:32:04.122484 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.191410 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.290412 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.322407 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.326804 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.399824 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.401730 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.440268 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.464440 4707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.472787 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.608338 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 29 03:32:04 crc kubenswrapper[4707]: 
I0129 03:32:04.617021 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.653515 4707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.718835 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.834265 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.881769 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 29 03:32:04 crc kubenswrapper[4707]: I0129 03:32:04.969363 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.188125 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.318390 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.377150 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.477475 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.524724 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.628702 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.636867 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.686657 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.721083 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.743643 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.841630 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 29 03:32:05 crc kubenswrapper[4707]: I0129 03:32:05.956255 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.001969 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.015605 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.082144 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.085992 4707 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.104673 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.231026 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.311851 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.357951 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.377494 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.448137 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.492917 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.524465 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.575046 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.699051 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.800321 4707 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.817314 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.875218 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 29 03:32:06 crc kubenswrapper[4707]: I0129 03:32:06.950724 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.031807 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.076561 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.191791 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.240507 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.366292 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.461853 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.533305 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 29 
03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.534904 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.564092 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.592267 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.598984 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.604319 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.618784 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.642828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.692286 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.769320 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.828272 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.857151 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.957825 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 29 03:32:07 crc kubenswrapper[4707]: I0129 03:32:07.974580 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.057052 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.116367 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.145413 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.221936 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.305090 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.339611 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.508073 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.533790 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.546530 4707 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.718858 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.750079 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.862683 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.916212 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 29 03:32:08 crc kubenswrapper[4707]: I0129 03:32:08.979839 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.197742 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.218121 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.252671 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.271716 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.272431 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" 
Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.298628 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.433213 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.535462 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.592103 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.621935 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.676051 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.685790 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.716221 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.720255 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.749745 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.794716 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.864951 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.956757 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 29 03:32:09 crc kubenswrapper[4707]: I0129 03:32:09.984942 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.084613 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.145227 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.153881 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.239924 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.294442 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.302222 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.344815 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.377126 4707 
kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.377494 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900" gracePeriod=5 Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.511107 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.537297 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.550287 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.556683 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.636945 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6fffd54687-82wfd"] Jan 29 03:32:10 crc kubenswrapper[4707]: E0129 03:32:10.637276 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.637294 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 29 03:32:10 crc kubenswrapper[4707]: E0129 03:32:10.637310 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" containerName="installer" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.637319 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" containerName="installer" Jan 29 03:32:10 crc kubenswrapper[4707]: E0129 03:32:10.637331 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" containerName="oauth-openshift" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.637339 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" containerName="oauth-openshift" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.637999 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.638921 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.638993 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3eb1b99-c0fc-477f-8e5d-b2af47e9fbaa" containerName="installer" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.639016 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9194d298-b1b5-4b06-9254-b484dc1a1382" containerName="oauth-openshift" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.640952 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.645917 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.646341 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.646461 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.646651 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.648173 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.648560 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.648563 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.650381 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.652327 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.661696 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 29 03:32:10 crc 
kubenswrapper[4707]: I0129 03:32:10.663255 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.663829 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.666442 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.670501 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.672552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fffd54687-82wfd"] Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.673781 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.674113 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.694650 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.829510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc 
kubenswrapper[4707]: I0129 03:32:10.830525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-template-error\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.830806 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.831139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.831381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.831638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-audit-policies\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.831829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.832013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.832201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-session\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.832395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c71ce15c-88e4-4956-9d47-afd1db8d264f-audit-dir\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: 
\"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.832601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.832789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-template-login\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.832971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.833143 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jvjh\" (UniqueName: \"kubernetes.io/projected/c71ce15c-88e4-4956-9d47-afd1db8d264f-kube-api-access-7jvjh\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 
crc kubenswrapper[4707]: I0129 03:32:10.864861 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.880844 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-audit-policies\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: 
\"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935275 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-session\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c71ce15c-88e4-4956-9d47-afd1db8d264f-audit-dir\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935381 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-template-login\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jvjh\" (UniqueName: \"kubernetes.io/projected/c71ce15c-88e4-4956-9d47-afd1db8d264f-kube-api-access-7jvjh\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935446 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: 
\"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.935491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-template-error\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.936238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.937513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-audit-policies\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.937642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c71ce15c-88e4-4956-9d47-afd1db8d264f-audit-dir\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.938580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.938679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.944488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.944603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.944985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-template-error\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.945946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-session\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.946911 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.954143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-user-template-login\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.954402 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.957623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jvjh\" (UniqueName: \"kubernetes.io/projected/c71ce15c-88e4-4956-9d47-afd1db8d264f-kube-api-access-7jvjh\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.957901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c71ce15c-88e4-4956-9d47-afd1db8d264f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fffd54687-82wfd\" (UID: \"c71ce15c-88e4-4956-9d47-afd1db8d264f\") " pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.978016 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.985492 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 29 03:32:10 crc kubenswrapper[4707]: I0129 03:32:10.990777 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.226424 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.313011 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.357453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.361794 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.377568 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.418054 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.454840 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.485069 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.490444 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.517721 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.545910 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.611101 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fffd54687-82wfd"]
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.643172 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.710161 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.744317 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.915449 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.996169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" event={"ID":"c71ce15c-88e4-4956-9d47-afd1db8d264f","Type":"ContainerStarted","Data":"0323b89c5645d3aaaffb4eb3e13dce2204d318bbca1df7c627d75644f43e1cff"}
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.996233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" event={"ID":"c71ce15c-88e4-4956-9d47-afd1db8d264f","Type":"ContainerStarted","Data":"d793a1260db7d6d8a5759cfbc828a76161f2a29c5de64e52430ad9c1c4df6a79"}
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.997088 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:11 crc kubenswrapper[4707]: I0129 03:32:11.999080 4707 patch_prober.go:28] interesting pod/oauth-openshift-6fffd54687-82wfd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" start-of-body=
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.002608 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" podUID="c71ce15c-88e4-4956-9d47-afd1db8d264f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.015806 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.042439 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd" podStartSLOduration=65.042410067 podStartE2EDuration="1m5.042410067s" podCreationTimestamp="2026-01-29 03:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:32:12.033206072 +0000 UTC m=+285.517434977" watchObservedRunningTime="2026-01-29 03:32:12.042410067 +0000 UTC m=+285.526638972"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.148061 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.301367 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.474967 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.573477 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.613383 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.616740 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.657772 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.678880 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.695390 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.727094 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.884060 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 29 03:32:12 crc kubenswrapper[4707]: I0129 03:32:12.921532 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.008100 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6fffd54687-82wfd"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.085318 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.185020 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.216262 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.277807 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.424303 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.554918 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.842736 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.868243 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 29 03:32:13 crc kubenswrapper[4707]: I0129 03:32:13.926575 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 29 03:32:14 crc kubenswrapper[4707]: I0129 03:32:14.302971 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 29 03:32:14 crc kubenswrapper[4707]: I0129 03:32:14.363889 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 29 03:32:15 crc kubenswrapper[4707]: I0129 03:32:15.428447 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 29 03:32:15 crc kubenswrapper[4707]: I0129 03:32:15.970575 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 29 03:32:15 crc kubenswrapper[4707]: I0129 03:32:15.970982 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.022933 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.022997 4707 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900" exitCode=137
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.023065 4707 scope.go:117] "RemoveContainer" containerID="bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900"
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.023205 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.041117 4707 scope.go:117] "RemoveContainer" containerID="bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900"
Jan 29 03:32:16 crc kubenswrapper[4707]: E0129 03:32:16.041575 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900\": container with ID starting with bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900 not found: ID does not exist" containerID="bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900"
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.041612 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900"} err="failed to get container status \"bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900\": rpc error: code = NotFound desc = could not find container \"bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900\": container with ID starting with bd52dc703895d00d4b8d4c5ca327180f3d68de3f53b2cdb0895e0c99d0242900 not found: ID does not exist"
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.109521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.109611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.109638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.109681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.109773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.110086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.110123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.110938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.110977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.128886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.211906 4707 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.211951 4707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.211962 4707 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.211976 4707 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.211984 4707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.308764 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Jan 29 03:32:16 crc kubenswrapper[4707]: I0129 03:32:16.711101 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 29 03:32:17 crc kubenswrapper[4707]: I0129 03:32:17.250584 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 29 03:32:17 crc kubenswrapper[4707]: I0129 03:32:17.250843 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Jan 29 03:32:17 crc kubenswrapper[4707]: I0129 03:32:17.261438 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 29 03:32:17 crc kubenswrapper[4707]: I0129 03:32:17.261509 4707 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="666a6dbc-84fd-4ac5-9e65-ff09762ec705"
Jan 29 03:32:17 crc kubenswrapper[4707]: I0129 03:32:17.265624 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 29 03:32:17 crc kubenswrapper[4707]: I0129 03:32:17.265674 4707 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="666a6dbc-84fd-4ac5-9e65-ff09762ec705"
Jan 29 03:32:26 crc kubenswrapper[4707]: I0129 03:32:26.997479 4707 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.264798 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-h88jj"]
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.265841 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" podUID="69e542f9-e9bf-424e-9d2c-852baf887b17" containerName="controller-manager" containerID="cri-o://9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443" gracePeriod=30
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.353211 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn"]
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.353501 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" podUID="6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" containerName="route-controller-manager" containerID="cri-o://a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052" gracePeriod=30
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.689274 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj"
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.753466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e542f9-e9bf-424e-9d2c-852baf887b17-serving-cert\") pod \"69e542f9-e9bf-424e-9d2c-852baf887b17\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") "
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.753704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-proxy-ca-bundles\") pod \"69e542f9-e9bf-424e-9d2c-852baf887b17\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") "
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.753761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-client-ca\") pod \"69e542f9-e9bf-424e-9d2c-852baf887b17\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") "
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.753783 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-config\") pod \"69e542f9-e9bf-424e-9d2c-852baf887b17\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") "
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.753840 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkwpq\" (UniqueName: \"kubernetes.io/projected/69e542f9-e9bf-424e-9d2c-852baf887b17-kube-api-access-tkwpq\") pod \"69e542f9-e9bf-424e-9d2c-852baf887b17\" (UID: \"69e542f9-e9bf-424e-9d2c-852baf887b17\") "
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.755375 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "69e542f9-e9bf-424e-9d2c-852baf887b17" (UID: "69e542f9-e9bf-424e-9d2c-852baf887b17"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.755429 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn"
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.755582 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-config" (OuterVolumeSpecName: "config") pod "69e542f9-e9bf-424e-9d2c-852baf887b17" (UID: "69e542f9-e9bf-424e-9d2c-852baf887b17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.757288 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-client-ca" (OuterVolumeSpecName: "client-ca") pod "69e542f9-e9bf-424e-9d2c-852baf887b17" (UID: "69e542f9-e9bf-424e-9d2c-852baf887b17"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.762299 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e542f9-e9bf-424e-9d2c-852baf887b17-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "69e542f9-e9bf-424e-9d2c-852baf887b17" (UID: "69e542f9-e9bf-424e-9d2c-852baf887b17"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.765300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e542f9-e9bf-424e-9d2c-852baf887b17-kube-api-access-tkwpq" (OuterVolumeSpecName: "kube-api-access-tkwpq") pod "69e542f9-e9bf-424e-9d2c-852baf887b17" (UID: "69e542f9-e9bf-424e-9d2c-852baf887b17"). InnerVolumeSpecName "kube-api-access-tkwpq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.855620 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwzx4\" (UniqueName: \"kubernetes.io/projected/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-kube-api-access-kwzx4\") pod \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") "
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.855778 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-client-ca\") pod \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") "
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.855814 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-serving-cert\") pod \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") "
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.855857 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-config\") pod \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\" (UID: \"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6\") "
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.856126 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.856139 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-client-ca\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.856149 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e542f9-e9bf-424e-9d2c-852baf887b17-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.856159 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkwpq\" (UniqueName: \"kubernetes.io/projected/69e542f9-e9bf-424e-9d2c-852baf887b17-kube-api-access-tkwpq\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.856170 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e542f9-e9bf-424e-9d2c-852baf887b17-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.857072 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-config" (OuterVolumeSpecName: "config") pod "6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" (UID: "6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.857637 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" (UID: "6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.859990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" (UID: "6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.860038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-kube-api-access-kwzx4" (OuterVolumeSpecName: "kube-api-access-kwzx4") pod "6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" (UID: "6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6"). InnerVolumeSpecName "kube-api-access-kwzx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.957296 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwzx4\" (UniqueName: \"kubernetes.io/projected/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-kube-api-access-kwzx4\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.957339 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-client-ca\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.957349 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:43 crc kubenswrapper[4707]: I0129 03:32:43.957358 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.211262 4707 generic.go:334] "Generic (PLEG): container finished" podID="6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" containerID="a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052" exitCode=0
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.211400 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn"
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.211388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" event={"ID":"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6","Type":"ContainerDied","Data":"a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052"}
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.211652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn" event={"ID":"6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6","Type":"ContainerDied","Data":"21f2b9ce302558eceab8cd0f502113af8af26d3a3d2a7bbbcdae7ebfb5a69718"}
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.211705 4707 scope.go:117] "RemoveContainer" containerID="a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052"
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.213972 4707 generic.go:334] "Generic (PLEG): container finished" podID="69e542f9-e9bf-424e-9d2c-852baf887b17" containerID="9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443" exitCode=0
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.214020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" event={"ID":"69e542f9-e9bf-424e-9d2c-852baf887b17","Type":"ContainerDied","Data":"9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443"}
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.214077 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj" event={"ID":"69e542f9-e9bf-424e-9d2c-852baf887b17","Type":"ContainerDied","Data":"59a4aa88b6ed83ad0478ad70a74434c02d7cfe9669c0d752ec3e5a18385ea3a7"}
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.214091 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-h88jj"
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.233923 4707 scope.go:117] "RemoveContainer" containerID="a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052"
Jan 29 03:32:44 crc kubenswrapper[4707]: E0129 03:32:44.236123 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052\": container with ID starting with a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052 not found: ID does not exist" containerID="a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052"
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.236257 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052"} err="failed to get container status \"a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052\": rpc error: code = NotFound desc = could not find container \"a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052\": container with ID starting with a88483662ec754ab8ae8ea7c4e9078fc1fcff747359d6f61d49625bf3a98b052 not found: ID does not exist"
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.236320 4707 scope.go:117] "RemoveContainer" containerID="9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443"
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.260877 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-h88jj"]
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.264234 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-h88jj"]
Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.272663 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn"] Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.275450 4707 scope.go:117] "RemoveContainer" containerID="9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443" Jan 29 03:32:44 crc kubenswrapper[4707]: E0129 03:32:44.276013 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443\": container with ID starting with 9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443 not found: ID does not exist" containerID="9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.276126 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443"} err="failed to get container status \"9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443\": rpc error: code = NotFound desc = could not find container \"9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443\": container with ID starting with 9dec188bb06f60c41359169b37dae0584501cddaf0295fe2042e32885f840443 not found: ID does not exist" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.276281 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mbczn"] Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.903934 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67"] Jan 29 03:32:44 crc kubenswrapper[4707]: E0129 03:32:44.904331 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" containerName="route-controller-manager" Jan 29 03:32:44 crc 
kubenswrapper[4707]: I0129 03:32:44.904347 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" containerName="route-controller-manager" Jan 29 03:32:44 crc kubenswrapper[4707]: E0129 03:32:44.904368 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e542f9-e9bf-424e-9d2c-852baf887b17" containerName="controller-manager" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.904376 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e542f9-e9bf-424e-9d2c-852baf887b17" containerName="controller-manager" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.904499 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" containerName="route-controller-manager" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.904514 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e542f9-e9bf-424e-9d2c-852baf887b17" containerName="controller-manager" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.905310 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.907974 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.908794 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.909073 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g"] Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.909510 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.909647 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.909667 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.909729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.910082 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.915019 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.915073 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.918264 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.918507 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.918811 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.919772 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.922330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67"] Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.922763 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.933843 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g"] Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.970392 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-client-ca\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.970860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-config\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.970979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-client-ca\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.971098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-config\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.971245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqzlm\" (UniqueName: \"kubernetes.io/projected/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-kube-api-access-dqzlm\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " 
pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.971340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-serving-cert\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.971468 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nmwd\" (UniqueName: \"kubernetes.io/projected/579e92f7-c58e-454c-9e96-d4ae38b61e7f-kube-api-access-8nmwd\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.971573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/579e92f7-c58e-454c-9e96-d4ae38b61e7f-serving-cert\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:44 crc kubenswrapper[4707]: I0129 03:32:44.971691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-proxy-ca-bundles\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.073343 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-proxy-ca-bundles\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.073754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-client-ca\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.073937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-config\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.074061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-client-ca\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.074185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-config\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:45 crc 
kubenswrapper[4707]: I0129 03:32:45.074321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqzlm\" (UniqueName: \"kubernetes.io/projected/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-kube-api-access-dqzlm\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.074451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-serving-cert\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.074608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nmwd\" (UniqueName: \"kubernetes.io/projected/579e92f7-c58e-454c-9e96-d4ae38b61e7f-kube-api-access-8nmwd\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.074745 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/579e92f7-c58e-454c-9e96-d4ae38b61e7f-serving-cert\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.075067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-client-ca\") pod 
\"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.075464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-proxy-ca-bundles\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.075917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-client-ca\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.076229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-config\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.078325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-config\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.082004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-serving-cert\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.084291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/579e92f7-c58e-454c-9e96-d4ae38b61e7f-serving-cert\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.105202 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nmwd\" (UniqueName: \"kubernetes.io/projected/579e92f7-c58e-454c-9e96-d4ae38b61e7f-kube-api-access-8nmwd\") pod \"route-controller-manager-65b6fc7b95-ddz4g\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.107081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqzlm\" (UniqueName: \"kubernetes.io/projected/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-kube-api-access-dqzlm\") pod \"controller-manager-7b5d9c9d88-vmw67\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.240461 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.249629 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.255654 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6" path="/var/lib/kubelet/pods/6521e0f5-cd7a-4ea9-8b9e-d8937fb952a6/volumes" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.256465 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e542f9-e9bf-424e-9d2c-852baf887b17" path="/var/lib/kubelet/pods/69e542f9-e9bf-424e-9d2c-852baf887b17/volumes" Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.497407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g"] Jan 29 03:32:45 crc kubenswrapper[4707]: I0129 03:32:45.536312 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67"] Jan 29 03:32:45 crc kubenswrapper[4707]: W0129 03:32:45.544229 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda921f4f8_58ac_44d7_ae8e_8d4f8a39081c.slice/crio-8e47e41d65b7c7b54c36662f9e14934567f10d545864f707b2563d75bdc878a2 WatchSource:0}: Error finding container 8e47e41d65b7c7b54c36662f9e14934567f10d545864f707b2563d75bdc878a2: Status 404 returned error can't find the container with id 8e47e41d65b7c7b54c36662f9e14934567f10d545864f707b2563d75bdc878a2 Jan 29 03:32:46 crc kubenswrapper[4707]: I0129 03:32:46.231525 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" event={"ID":"579e92f7-c58e-454c-9e96-d4ae38b61e7f","Type":"ContainerStarted","Data":"eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8"} Jan 29 03:32:46 crc kubenswrapper[4707]: I0129 03:32:46.231974 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" event={"ID":"579e92f7-c58e-454c-9e96-d4ae38b61e7f","Type":"ContainerStarted","Data":"0f653a8625bdad04006a27bdaf024ccd19351466e445faba1dafb68538045e0f"} Jan 29 03:32:46 crc kubenswrapper[4707]: I0129 03:32:46.233950 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:46 crc kubenswrapper[4707]: I0129 03:32:46.234163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" event={"ID":"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c","Type":"ContainerStarted","Data":"07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b"} Jan 29 03:32:46 crc kubenswrapper[4707]: I0129 03:32:46.234232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" event={"ID":"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c","Type":"ContainerStarted","Data":"8e47e41d65b7c7b54c36662f9e14934567f10d545864f707b2563d75bdc878a2"} Jan 29 03:32:46 crc kubenswrapper[4707]: I0129 03:32:46.234389 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:46 crc kubenswrapper[4707]: I0129 03:32:46.238068 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:32:46 crc kubenswrapper[4707]: I0129 03:32:46.239960 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:32:46 crc kubenswrapper[4707]: I0129 03:32:46.272031 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" 
podStartSLOduration=3.272004028 podStartE2EDuration="3.272004028s" podCreationTimestamp="2026-01-29 03:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:32:46.253819373 +0000 UTC m=+319.738048278" watchObservedRunningTime="2026-01-29 03:32:46.272004028 +0000 UTC m=+319.756232943" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.226575 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" podStartSLOduration=20.226527492 podStartE2EDuration="20.226527492s" podCreationTimestamp="2026-01-29 03:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:32:46.299411608 +0000 UTC m=+319.783640523" watchObservedRunningTime="2026-01-29 03:33:03.226527492 +0000 UTC m=+336.710756397" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.230638 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g"] Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.230976 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" podUID="579e92f7-c58e-454c-9e96-d4ae38b61e7f" containerName="route-controller-manager" containerID="cri-o://eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8" gracePeriod=30 Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.467663 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 
03:33:03.468235 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.703561 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.760274 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-client-ca\") pod \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.760413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nmwd\" (UniqueName: \"kubernetes.io/projected/579e92f7-c58e-454c-9e96-d4ae38b61e7f-kube-api-access-8nmwd\") pod \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.760528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/579e92f7-c58e-454c-9e96-d4ae38b61e7f-serving-cert\") pod \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.760604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-config\") pod \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\" (UID: \"579e92f7-c58e-454c-9e96-d4ae38b61e7f\") " Jan 29 03:33:03 crc 
kubenswrapper[4707]: I0129 03:33:03.761808 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-client-ca" (OuterVolumeSpecName: "client-ca") pod "579e92f7-c58e-454c-9e96-d4ae38b61e7f" (UID: "579e92f7-c58e-454c-9e96-d4ae38b61e7f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.761867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-config" (OuterVolumeSpecName: "config") pod "579e92f7-c58e-454c-9e96-d4ae38b61e7f" (UID: "579e92f7-c58e-454c-9e96-d4ae38b61e7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.768781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/579e92f7-c58e-454c-9e96-d4ae38b61e7f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "579e92f7-c58e-454c-9e96-d4ae38b61e7f" (UID: "579e92f7-c58e-454c-9e96-d4ae38b61e7f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.771834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579e92f7-c58e-454c-9e96-d4ae38b61e7f-kube-api-access-8nmwd" (OuterVolumeSpecName: "kube-api-access-8nmwd") pod "579e92f7-c58e-454c-9e96-d4ae38b61e7f" (UID: "579e92f7-c58e-454c-9e96-d4ae38b61e7f"). InnerVolumeSpecName "kube-api-access-8nmwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.862914 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nmwd\" (UniqueName: \"kubernetes.io/projected/579e92f7-c58e-454c-9e96-d4ae38b61e7f-kube-api-access-8nmwd\") on node \"crc\" DevicePath \"\"" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.862960 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/579e92f7-c58e-454c-9e96-d4ae38b61e7f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.862974 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:33:03 crc kubenswrapper[4707]: I0129 03:33:03.862984 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/579e92f7-c58e-454c-9e96-d4ae38b61e7f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.340325 4707 generic.go:334] "Generic (PLEG): container finished" podID="579e92f7-c58e-454c-9e96-d4ae38b61e7f" containerID="eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8" exitCode=0 Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.340835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" event={"ID":"579e92f7-c58e-454c-9e96-d4ae38b61e7f","Type":"ContainerDied","Data":"eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8"} Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.340882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" 
event={"ID":"579e92f7-c58e-454c-9e96-d4ae38b61e7f","Type":"ContainerDied","Data":"0f653a8625bdad04006a27bdaf024ccd19351466e445faba1dafb68538045e0f"} Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.340911 4707 scope.go:117] "RemoveContainer" containerID="eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.341084 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.372652 4707 scope.go:117] "RemoveContainer" containerID="eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8" Jan 29 03:33:04 crc kubenswrapper[4707]: E0129 03:33:04.373862 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8\": container with ID starting with eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8 not found: ID does not exist" containerID="eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.374042 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8"} err="failed to get container status \"eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8\": rpc error: code = NotFound desc = could not find container \"eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8\": container with ID starting with eaa1e4f29f390a8a4745f79b3ba3e37ac2dee576a1cb845199f9a29a383fcab8 not found: ID does not exist" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.384822 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g"] Jan 
29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.388238 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b6fc7b95-ddz4g"] Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.923372 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt"] Jan 29 03:33:04 crc kubenswrapper[4707]: E0129 03:33:04.923643 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579e92f7-c58e-454c-9e96-d4ae38b61e7f" containerName="route-controller-manager" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.923660 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="579e92f7-c58e-454c-9e96-d4ae38b61e7f" containerName="route-controller-manager" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.923784 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="579e92f7-c58e-454c-9e96-d4ae38b61e7f" containerName="route-controller-manager" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.924331 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.927437 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.927634 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.927635 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.927661 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.928322 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.929525 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.944264 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt"] Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.980226 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8dhj\" (UniqueName: \"kubernetes.io/projected/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-kube-api-access-p8dhj\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.980318 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-client-ca\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.980444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-config\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:04 crc kubenswrapper[4707]: I0129 03:33:04.980524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-serving-cert\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.081622 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-serving-cert\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.081821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8dhj\" (UniqueName: \"kubernetes.io/projected/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-kube-api-access-p8dhj\") pod 
\"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.081873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-client-ca\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.081914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-config\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.083936 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-client-ca\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.084283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-config\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.090217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-serving-cert\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.100184 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8dhj\" (UniqueName: \"kubernetes.io/projected/eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf-kube-api-access-p8dhj\") pod \"route-controller-manager-57b96685c4-4rlqt\" (UID: \"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf\") " pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.247858 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.253419 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579e92f7-c58e-454c-9e96-d4ae38b61e7f" path="/var/lib/kubelet/pods/579e92f7-c58e-454c-9e96-d4ae38b61e7f/volumes" Jan 29 03:33:05 crc kubenswrapper[4707]: I0129 03:33:05.679861 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt"] Jan 29 03:33:05 crc kubenswrapper[4707]: W0129 03:33:05.691126 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb8ed281_aced_4a47_9b4e_9a86a4cf4bbf.slice/crio-b1f4e789b07df704ab8053cf18622c70f6c594b7d5ed4d9e5188dc11bf170404 WatchSource:0}: Error finding container b1f4e789b07df704ab8053cf18622c70f6c594b7d5ed4d9e5188dc11bf170404: Status 404 returned error can't find the container with id b1f4e789b07df704ab8053cf18622c70f6c594b7d5ed4d9e5188dc11bf170404 Jan 29 03:33:06 crc kubenswrapper[4707]: 
I0129 03:33:06.357930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" event={"ID":"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf","Type":"ContainerStarted","Data":"badc7450ccf22bcb1e02e28715d0a1099470c1f7b7e4224d8c748a2e89def1c9"} Jan 29 03:33:06 crc kubenswrapper[4707]: I0129 03:33:06.358003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" event={"ID":"eb8ed281-aced-4a47-9b4e-9a86a4cf4bbf","Type":"ContainerStarted","Data":"b1f4e789b07df704ab8053cf18622c70f6c594b7d5ed4d9e5188dc11bf170404"} Jan 29 03:33:06 crc kubenswrapper[4707]: I0129 03:33:06.358238 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:06 crc kubenswrapper[4707]: I0129 03:33:06.363408 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" Jan 29 03:33:06 crc kubenswrapper[4707]: I0129 03:33:06.376817 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57b96685c4-4rlqt" podStartSLOduration=3.376790785 podStartE2EDuration="3.376790785s" podCreationTimestamp="2026-01-29 03:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:33:06.374639294 +0000 UTC m=+339.858868209" watchObservedRunningTime="2026-01-29 03:33:06.376790785 +0000 UTC m=+339.861019690" Jan 29 03:33:26 crc kubenswrapper[4707]: I0129 03:33:26.902023 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jgwqj"] Jan 29 03:33:26 crc kubenswrapper[4707]: I0129 03:33:26.903649 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:26 crc kubenswrapper[4707]: I0129 03:33:26.918823 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jgwqj"] Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.009994 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.010072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/87f68df3-f730-4203-a6a3-d7c4f089ee37-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.010167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/87f68df3-f730-4203-a6a3-d7c4f089ee37-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.010195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxnh8\" (UniqueName: \"kubernetes.io/projected/87f68df3-f730-4203-a6a3-d7c4f089ee37-kube-api-access-vxnh8\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.010223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/87f68df3-f730-4203-a6a3-d7c4f089ee37-registry-tls\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.010243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/87f68df3-f730-4203-a6a3-d7c4f089ee37-bound-sa-token\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.010262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/87f68df3-f730-4203-a6a3-d7c4f089ee37-registry-certificates\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.010300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87f68df3-f730-4203-a6a3-d7c4f089ee37-trusted-ca\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.052946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.111686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/87f68df3-f730-4203-a6a3-d7c4f089ee37-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.111769 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxnh8\" (UniqueName: \"kubernetes.io/projected/87f68df3-f730-4203-a6a3-d7c4f089ee37-kube-api-access-vxnh8\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.111799 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/87f68df3-f730-4203-a6a3-d7c4f089ee37-registry-tls\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.111816 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/87f68df3-f730-4203-a6a3-d7c4f089ee37-bound-sa-token\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.111838 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/87f68df3-f730-4203-a6a3-d7c4f089ee37-registry-certificates\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.111888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87f68df3-f730-4203-a6a3-d7c4f089ee37-trusted-ca\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.111915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/87f68df3-f730-4203-a6a3-d7c4f089ee37-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.113377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/87f68df3-f730-4203-a6a3-d7c4f089ee37-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.114370 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87f68df3-f730-4203-a6a3-d7c4f089ee37-trusted-ca\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc 
kubenswrapper[4707]: I0129 03:33:27.114977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/87f68df3-f730-4203-a6a3-d7c4f089ee37-registry-certificates\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.124854 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/87f68df3-f730-4203-a6a3-d7c4f089ee37-registry-tls\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.125503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/87f68df3-f730-4203-a6a3-d7c4f089ee37-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.128918 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/87f68df3-f730-4203-a6a3-d7c4f089ee37-bound-sa-token\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.129107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxnh8\" (UniqueName: \"kubernetes.io/projected/87f68df3-f730-4203-a6a3-d7c4f089ee37-kube-api-access-vxnh8\") pod \"image-registry-66df7c8f76-jgwqj\" (UID: \"87f68df3-f730-4203-a6a3-d7c4f089ee37\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.225565 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:27 crc kubenswrapper[4707]: I0129 03:33:27.655775 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jgwqj"] Jan 29 03:33:27 crc kubenswrapper[4707]: W0129 03:33:27.658954 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f68df3_f730_4203_a6a3_d7c4f089ee37.slice/crio-eff48cadf7b5347c1eb964db80a5b1402a3de263ba381ab2823181b3f854a1cc WatchSource:0}: Error finding container eff48cadf7b5347c1eb964db80a5b1402a3de263ba381ab2823181b3f854a1cc: Status 404 returned error can't find the container with id eff48cadf7b5347c1eb964db80a5b1402a3de263ba381ab2823181b3f854a1cc Jan 29 03:33:28 crc kubenswrapper[4707]: I0129 03:33:28.628578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" event={"ID":"87f68df3-f730-4203-a6a3-d7c4f089ee37","Type":"ContainerStarted","Data":"17fed76f0e0f83a2495f0f422fb513b2d2439e4273ef473b60432df203263d94"} Jan 29 03:33:28 crc kubenswrapper[4707]: I0129 03:33:28.628637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" event={"ID":"87f68df3-f730-4203-a6a3-d7c4f089ee37","Type":"ContainerStarted","Data":"eff48cadf7b5347c1eb964db80a5b1402a3de263ba381ab2823181b3f854a1cc"} Jan 29 03:33:28 crc kubenswrapper[4707]: I0129 03:33:28.629155 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:28 crc kubenswrapper[4707]: I0129 03:33:28.672258 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" podStartSLOduration=2.672236083 podStartE2EDuration="2.672236083s" podCreationTimestamp="2026-01-29 03:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:33:28.669070013 +0000 UTC m=+362.153298958" watchObservedRunningTime="2026-01-29 03:33:28.672236083 +0000 UTC m=+362.156464988" Jan 29 03:33:33 crc kubenswrapper[4707]: I0129 03:33:33.463106 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:33:33 crc kubenswrapper[4707]: I0129 03:33:33.464082 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.210047 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67"] Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.211130 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" podUID="a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" containerName="controller-manager" containerID="cri-o://07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b" gracePeriod=30 Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.621788 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.692305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-client-ca\") pod \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.692461 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-config\") pod \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.692565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-serving-cert\") pod \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.692660 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-proxy-ca-bundles\") pod \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.692715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqzlm\" (UniqueName: \"kubernetes.io/projected/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-kube-api-access-dqzlm\") pod \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\" (UID: \"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c\") " Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.693801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-config" (OuterVolumeSpecName: "config") pod "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" (UID: "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.693834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" (UID: "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.693835 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-client-ca" (OuterVolumeSpecName: "client-ca") pod "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" (UID: "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.700714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" (UID: "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.701905 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-kube-api-access-dqzlm" (OuterVolumeSpecName: "kube-api-access-dqzlm") pod "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" (UID: "a921f4f8-58ac-44d7-ae8e-8d4f8a39081c"). InnerVolumeSpecName "kube-api-access-dqzlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.729910 4707 generic.go:334] "Generic (PLEG): container finished" podID="a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" containerID="07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b" exitCode=0 Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.729974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" event={"ID":"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c","Type":"ContainerDied","Data":"07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b"} Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.730012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" event={"ID":"a921f4f8-58ac-44d7-ae8e-8d4f8a39081c","Type":"ContainerDied","Data":"8e47e41d65b7c7b54c36662f9e14934567f10d545864f707b2563d75bdc878a2"} Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.730033 4707 scope.go:117] "RemoveContainer" containerID="07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.730172 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.781382 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67"] Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.786444 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b5d9c9d88-vmw67"] Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.790264 4707 scope.go:117] "RemoveContainer" containerID="07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b" Jan 29 03:33:43 crc kubenswrapper[4707]: E0129 03:33:43.791079 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b\": container with ID starting with 07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b not found: ID does not exist" containerID="07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.791271 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b"} err="failed to get container status \"07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b\": rpc error: code = NotFound desc = could not find container \"07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b\": container with ID starting with 07b58f946d78a8f83a9ee6ed9e9eaaf53c80530b38a0c7bda630db9039ea560b not found: ID does not exist" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.793747 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:33:43 crc 
kubenswrapper[4707]: I0129 03:33:43.793769 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.793782 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqzlm\" (UniqueName: \"kubernetes.io/projected/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-kube-api-access-dqzlm\") on node \"crc\" DevicePath \"\"" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.793791 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:33:43 crc kubenswrapper[4707]: I0129 03:33:43.793801 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.966779 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv"] Jan 29 03:33:44 crc kubenswrapper[4707]: E0129 03:33:44.967475 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" containerName="controller-manager" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.967491 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" containerName="controller-manager" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.967650 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" containerName="controller-manager" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.968246 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.971094 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.971523 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.971796 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.971832 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.972277 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.973978 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.987081 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv"] Jan 29 03:33:44 crc kubenswrapper[4707]: I0129 03:33:44.988103 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.009174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c832f344-ee7b-4c0e-937b-951569eb481f-config\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " 
pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.009242 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kgpr\" (UniqueName: \"kubernetes.io/projected/c832f344-ee7b-4c0e-937b-951569eb481f-kube-api-access-7kgpr\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.009316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c832f344-ee7b-4c0e-937b-951569eb481f-proxy-ca-bundles\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.009349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c832f344-ee7b-4c0e-937b-951569eb481f-client-ca\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.009374 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c832f344-ee7b-4c0e-937b-951569eb481f-serving-cert\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.110341 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c832f344-ee7b-4c0e-937b-951569eb481f-config\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.112028 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c832f344-ee7b-4c0e-937b-951569eb481f-config\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.112095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kgpr\" (UniqueName: \"kubernetes.io/projected/c832f344-ee7b-4c0e-937b-951569eb481f-kube-api-access-7kgpr\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.112271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c832f344-ee7b-4c0e-937b-951569eb481f-proxy-ca-bundles\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.113068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c832f344-ee7b-4c0e-937b-951569eb481f-proxy-ca-bundles\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.113154 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c832f344-ee7b-4c0e-937b-951569eb481f-client-ca\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.113195 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c832f344-ee7b-4c0e-937b-951569eb481f-serving-cert\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.113853 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c832f344-ee7b-4c0e-937b-951569eb481f-client-ca\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.121314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c832f344-ee7b-4c0e-937b-951569eb481f-serving-cert\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.130423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kgpr\" (UniqueName: \"kubernetes.io/projected/c832f344-ee7b-4c0e-937b-951569eb481f-kube-api-access-7kgpr\") pod \"controller-manager-65c9b5b4f6-gzwkv\" (UID: \"c832f344-ee7b-4c0e-937b-951569eb481f\") " pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" 
Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.252154 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a921f4f8-58ac-44d7-ae8e-8d4f8a39081c" path="/var/lib/kubelet/pods/a921f4f8-58ac-44d7-ae8e-8d4f8a39081c/volumes" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.288785 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.699884 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv"] Jan 29 03:33:45 crc kubenswrapper[4707]: I0129 03:33:45.746390 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" event={"ID":"c832f344-ee7b-4c0e-937b-951569eb481f","Type":"ContainerStarted","Data":"fa1f3b00721472b9469e5978fdeb5e60063b425d0905b35d40c17c7dc18b5c51"} Jan 29 03:33:46 crc kubenswrapper[4707]: I0129 03:33:46.754498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" event={"ID":"c832f344-ee7b-4c0e-937b-951569eb481f","Type":"ContainerStarted","Data":"038ac7ae37d472a5f6efa2ffcaed30db8982f4976d58b50da253dc973ef0d936"} Jan 29 03:33:46 crc kubenswrapper[4707]: I0129 03:33:46.754997 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:46 crc kubenswrapper[4707]: I0129 03:33:46.758883 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" Jan 29 03:33:46 crc kubenswrapper[4707]: I0129 03:33:46.779387 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-65c9b5b4f6-gzwkv" podStartSLOduration=3.779349814 
podStartE2EDuration="3.779349814s" podCreationTimestamp="2026-01-29 03:33:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:33:46.773005245 +0000 UTC m=+380.257234150" watchObservedRunningTime="2026-01-29 03:33:46.779349814 +0000 UTC m=+380.263578719" Jan 29 03:33:47 crc kubenswrapper[4707]: I0129 03:33:47.231030 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jgwqj" Jan 29 03:33:47 crc kubenswrapper[4707]: I0129 03:33:47.300111 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z575b"] Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.797958 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nhjg7"] Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.799292 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nhjg7" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" containerName="registry-server" containerID="cri-o://51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473" gracePeriod=30 Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.805701 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xrvfw"] Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.806083 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xrvfw" podUID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerName="registry-server" containerID="cri-o://399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b" gracePeriod=30 Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.814423 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-tpsmp"] Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.814816 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" podUID="eeef9237-bd0b-494d-a3a0-8b6e54baa03e" containerName="marketplace-operator" containerID="cri-o://7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8" gracePeriod=30 Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.837523 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hb89x"] Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.838004 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hb89x" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerName="registry-server" containerID="cri-o://c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a" gracePeriod=30 Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.846682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxdsw"] Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.847136 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wxdsw" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerName="registry-server" containerID="cri-o://67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6" gracePeriod=30 Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.861380 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qwgfr"] Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.862483 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:02 crc kubenswrapper[4707]: I0129 03:34:02.867302 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qwgfr"] Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.003814 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d3599142-c844-4a86-9bef-e589d69f0ef4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qwgfr\" (UID: \"d3599142-c844-4a86-9bef-e589d69f0ef4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.003867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3599142-c844-4a86-9bef-e589d69f0ef4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qwgfr\" (UID: \"d3599142-c844-4a86-9bef-e589d69f0ef4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.003993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96bzg\" (UniqueName: \"kubernetes.io/projected/d3599142-c844-4a86-9bef-e589d69f0ef4-kube-api-access-96bzg\") pod \"marketplace-operator-79b997595-qwgfr\" (UID: \"d3599142-c844-4a86-9bef-e589d69f0ef4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.105338 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96bzg\" (UniqueName: \"kubernetes.io/projected/d3599142-c844-4a86-9bef-e589d69f0ef4-kube-api-access-96bzg\") pod \"marketplace-operator-79b997595-qwgfr\" (UID: 
\"d3599142-c844-4a86-9bef-e589d69f0ef4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.105625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d3599142-c844-4a86-9bef-e589d69f0ef4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qwgfr\" (UID: \"d3599142-c844-4a86-9bef-e589d69f0ef4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.105728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3599142-c844-4a86-9bef-e589d69f0ef4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qwgfr\" (UID: \"d3599142-c844-4a86-9bef-e589d69f0ef4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.107763 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3599142-c844-4a86-9bef-e589d69f0ef4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qwgfr\" (UID: \"d3599142-c844-4a86-9bef-e589d69f0ef4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.116601 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d3599142-c844-4a86-9bef-e589d69f0ef4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qwgfr\" (UID: \"d3599142-c844-4a86-9bef-e589d69f0ef4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.124912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-96bzg\" (UniqueName: \"kubernetes.io/projected/d3599142-c844-4a86-9bef-e589d69f0ef4-kube-api-access-96bzg\") pod \"marketplace-operator-79b997595-qwgfr\" (UID: \"d3599142-c844-4a86-9bef-e589d69f0ef4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.268175 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.351148 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrvfw" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.463337 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.463410 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.463474 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.464261 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2018b8d36afa3f2a5c920f93a22bd21150b05028d7be0b59b8d8babfd9ed3779"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.464334 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://2018b8d36afa3f2a5c920f93a22bd21150b05028d7be0b59b8d8babfd9ed3779" gracePeriod=600 Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.512499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-568r6\" (UniqueName: \"kubernetes.io/projected/b4d4ff70-611c-4a65-982c-f551baa66bd5-kube-api-access-568r6\") pod \"b4d4ff70-611c-4a65-982c-f551baa66bd5\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.512620 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-catalog-content\") pod \"b4d4ff70-611c-4a65-982c-f551baa66bd5\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.512705 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-utilities\") pod \"b4d4ff70-611c-4a65-982c-f551baa66bd5\" (UID: \"b4d4ff70-611c-4a65-982c-f551baa66bd5\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.514023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-utilities" (OuterVolumeSpecName: "utilities") pod "b4d4ff70-611c-4a65-982c-f551baa66bd5" (UID: "b4d4ff70-611c-4a65-982c-f551baa66bd5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.520732 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d4ff70-611c-4a65-982c-f551baa66bd5-kube-api-access-568r6" (OuterVolumeSpecName: "kube-api-access-568r6") pod "b4d4ff70-611c-4a65-982c-f551baa66bd5" (UID: "b4d4ff70-611c-4a65-982c-f551baa66bd5"). InnerVolumeSpecName "kube-api-access-568r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.586307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4d4ff70-611c-4a65-982c-f551baa66bd5" (UID: "b4d4ff70-611c-4a65-982c-f551baa66bd5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.597467 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.612478 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.617395 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.617496 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d4ff70-611c-4a65-982c-f551baa66bd5-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.617510 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-568r6\" (UniqueName: \"kubernetes.io/projected/b4d4ff70-611c-4a65-982c-f551baa66bd5-kube-api-access-568r6\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.622745 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.627505 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.718570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llwfv\" (UniqueName: \"kubernetes.io/projected/a832dac2-976f-45e7-adc9-fc29666d0721-kube-api-access-llwfv\") pod \"a832dac2-976f-45e7-adc9-fc29666d0721\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-utilities\") pod \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-operator-metrics\") pod \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmxt7\" (UniqueName: \"kubernetes.io/projected/ad1e990a-db38-4eb8-8ae9-6bd700728e48-kube-api-access-fmxt7\") pod \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-catalog-content\") pod \"a832dac2-976f-45e7-adc9-fc29666d0721\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719303 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-catalog-content\") pod \"c814051a-bbf1-4219-8089-8124cb1d3b7b\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-utilities\") pod \"a832dac2-976f-45e7-adc9-fc29666d0721\" (UID: \"a832dac2-976f-45e7-adc9-fc29666d0721\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v2zk\" (UniqueName: \"kubernetes.io/projected/c814051a-bbf1-4219-8089-8124cb1d3b7b-kube-api-access-4v2zk\") pod \"c814051a-bbf1-4219-8089-8124cb1d3b7b\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-utilities\") pod \"c814051a-bbf1-4219-8089-8124cb1d3b7b\" (UID: \"c814051a-bbf1-4219-8089-8124cb1d3b7b\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs2n5\" (UniqueName: \"kubernetes.io/projected/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-kube-api-access-rs2n5\") pod \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-trusted-ca\") pod 
\"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\" (UID: \"eeef9237-bd0b-494d-a3a0-8b6e54baa03e\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.719530 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-catalog-content\") pod \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\" (UID: \"ad1e990a-db38-4eb8-8ae9-6bd700728e48\") " Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.733422 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-utilities" (OuterVolumeSpecName: "utilities") pod "a832dac2-976f-45e7-adc9-fc29666d0721" (UID: "a832dac2-976f-45e7-adc9-fc29666d0721"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.734685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-utilities" (OuterVolumeSpecName: "utilities") pod "ad1e990a-db38-4eb8-8ae9-6bd700728e48" (UID: "ad1e990a-db38-4eb8-8ae9-6bd700728e48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.735125 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "eeef9237-bd0b-494d-a3a0-8b6e54baa03e" (UID: "eeef9237-bd0b-494d-a3a0-8b6e54baa03e"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.736875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-utilities" (OuterVolumeSpecName: "utilities") pod "c814051a-bbf1-4219-8089-8124cb1d3b7b" (UID: "c814051a-bbf1-4219-8089-8124cb1d3b7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.738608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a832dac2-976f-45e7-adc9-fc29666d0721-kube-api-access-llwfv" (OuterVolumeSpecName: "kube-api-access-llwfv") pod "a832dac2-976f-45e7-adc9-fc29666d0721" (UID: "a832dac2-976f-45e7-adc9-fc29666d0721"). InnerVolumeSpecName "kube-api-access-llwfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.741143 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-kube-api-access-rs2n5" (OuterVolumeSpecName: "kube-api-access-rs2n5") pod "eeef9237-bd0b-494d-a3a0-8b6e54baa03e" (UID: "eeef9237-bd0b-494d-a3a0-8b6e54baa03e"). InnerVolumeSpecName "kube-api-access-rs2n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.752610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c814051a-bbf1-4219-8089-8124cb1d3b7b-kube-api-access-4v2zk" (OuterVolumeSpecName: "kube-api-access-4v2zk") pod "c814051a-bbf1-4219-8089-8124cb1d3b7b" (UID: "c814051a-bbf1-4219-8089-8124cb1d3b7b"). InnerVolumeSpecName "kube-api-access-4v2zk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.754131 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1e990a-db38-4eb8-8ae9-6bd700728e48-kube-api-access-fmxt7" (OuterVolumeSpecName: "kube-api-access-fmxt7") pod "ad1e990a-db38-4eb8-8ae9-6bd700728e48" (UID: "ad1e990a-db38-4eb8-8ae9-6bd700728e48"). InnerVolumeSpecName "kube-api-access-fmxt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.757064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "eeef9237-bd0b-494d-a3a0-8b6e54baa03e" (UID: "eeef9237-bd0b-494d-a3a0-8b6e54baa03e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.777738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad1e990a-db38-4eb8-8ae9-6bd700728e48" (UID: "ad1e990a-db38-4eb8-8ae9-6bd700728e48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.795494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a832dac2-976f-45e7-adc9-fc29666d0721" (UID: "a832dac2-976f-45e7-adc9-fc29666d0721"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822479 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822515 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs2n5\" (UniqueName: \"kubernetes.io/projected/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-kube-api-access-rs2n5\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822528 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822553 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822563 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llwfv\" (UniqueName: \"kubernetes.io/projected/a832dac2-976f-45e7-adc9-fc29666d0721-kube-api-access-llwfv\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822573 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1e990a-db38-4eb8-8ae9-6bd700728e48-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822582 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eeef9237-bd0b-494d-a3a0-8b6e54baa03e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 
crc kubenswrapper[4707]: I0129 03:34:03.822593 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmxt7\" (UniqueName: \"kubernetes.io/projected/ad1e990a-db38-4eb8-8ae9-6bd700728e48-kube-api-access-fmxt7\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822601 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822610 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a832dac2-976f-45e7-adc9-fc29666d0721-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.822619 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v2zk\" (UniqueName: \"kubernetes.io/projected/c814051a-bbf1-4219-8089-8124cb1d3b7b-kube-api-access-4v2zk\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.843439 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qwgfr"] Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.888622 4707 generic.go:334] "Generic (PLEG): container finished" podID="a832dac2-976f-45e7-adc9-fc29666d0721" containerID="51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473" exitCode=0 Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.888695 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhjg7" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.888710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjg7" event={"ID":"a832dac2-976f-45e7-adc9-fc29666d0721","Type":"ContainerDied","Data":"51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.888806 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjg7" event={"ID":"a832dac2-976f-45e7-adc9-fc29666d0721","Type":"ContainerDied","Data":"9bb06d26a551ddb3d1ff3c417b44f38f47b2b8451e9cc22b56cc6733b9a1bd6d"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.888834 4707 scope.go:117] "RemoveContainer" containerID="51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.892068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c814051a-bbf1-4219-8089-8124cb1d3b7b" (UID: "c814051a-bbf1-4219-8089-8124cb1d3b7b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.895217 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="2018b8d36afa3f2a5c920f93a22bd21150b05028d7be0b59b8d8babfd9ed3779" exitCode=0 Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.895253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"2018b8d36afa3f2a5c920f93a22bd21150b05028d7be0b59b8d8babfd9ed3779"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.895305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"7d626ab43efa55857c89f32b874520ad65ab395b5a2359de01e47becbb927c08"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.899377 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerID="399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b" exitCode=0 Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.899424 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xrvfw" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.899454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrvfw" event={"ID":"b4d4ff70-611c-4a65-982c-f551baa66bd5","Type":"ContainerDied","Data":"399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.899480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrvfw" event={"ID":"b4d4ff70-611c-4a65-982c-f551baa66bd5","Type":"ContainerDied","Data":"8040e8d500b2719590accb1880bb964170067879cf740a47dab9641e2a62f7f8"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.903670 4707 generic.go:334] "Generic (PLEG): container finished" podID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerID="67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6" exitCode=0 Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.903782 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxdsw" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.903809 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxdsw" event={"ID":"c814051a-bbf1-4219-8089-8124cb1d3b7b","Type":"ContainerDied","Data":"67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.903864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxdsw" event={"ID":"c814051a-bbf1-4219-8089-8124cb1d3b7b","Type":"ContainerDied","Data":"1252384938d33057344238dc2bb7a6b8f818e385184fa2732bd805ed778fe8ee"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.907031 4707 generic.go:334] "Generic (PLEG): container finished" podID="eeef9237-bd0b-494d-a3a0-8b6e54baa03e" containerID="7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8" exitCode=0 Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.907093 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" event={"ID":"eeef9237-bd0b-494d-a3a0-8b6e54baa03e","Type":"ContainerDied","Data":"7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.907118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" event={"ID":"eeef9237-bd0b-494d-a3a0-8b6e54baa03e","Type":"ContainerDied","Data":"da18ab320b74de2f8687ed0e9e014d3db3c41c3a29228cf55d1003b8bff24c3e"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.907175 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tpsmp" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.910845 4707 generic.go:334] "Generic (PLEG): container finished" podID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerID="c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a" exitCode=0 Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.910926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hb89x" event={"ID":"ad1e990a-db38-4eb8-8ae9-6bd700728e48","Type":"ContainerDied","Data":"c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.910963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hb89x" event={"ID":"ad1e990a-db38-4eb8-8ae9-6bd700728e48","Type":"ContainerDied","Data":"2c854bc9ddb6166e890397dfe4cc7a80d5d5488b307effcebac9de3d33c25899"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.911072 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hb89x" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.912357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" event={"ID":"d3599142-c844-4a86-9bef-e589d69f0ef4","Type":"ContainerStarted","Data":"648ee4c07267f2ce748a429afdfd4f54c380ab65391b0137dbca20ddc7bc6bc1"} Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.914712 4707 scope.go:117] "RemoveContainer" containerID="5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.925009 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c814051a-bbf1-4219-8089-8124cb1d3b7b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.947012 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nhjg7"] Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.951323 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nhjg7"] Jan 29 03:34:03 crc kubenswrapper[4707]: I0129 03:34:03.963045 4707 scope.go:117] "RemoveContainer" containerID="aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.000153 4707 scope.go:117] "RemoveContainer" containerID="51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.004564 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473\": container with ID starting with 51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473 not found: ID does not exist" 
containerID="51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.004598 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473"} err="failed to get container status \"51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473\": rpc error: code = NotFound desc = could not find container \"51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473\": container with ID starting with 51337fb8d467961c6f78e0998a3d81d71013fb74c13df3e3ce8101fcb80f3473 not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.004631 4707 scope.go:117] "RemoveContainer" containerID="5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.005233 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59\": container with ID starting with 5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59 not found: ID does not exist" containerID="5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.005292 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59"} err="failed to get container status \"5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59\": rpc error: code = NotFound desc = could not find container \"5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59\": container with ID starting with 5c89078ba8a39db8d80ce652d493787255f6d0a853e0454beda6bdf69b6b4e59 not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.005311 4707 scope.go:117] 
"RemoveContainer" containerID="aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.005762 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf\": container with ID starting with aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf not found: ID does not exist" containerID="aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.005788 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf"} err="failed to get container status \"aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf\": rpc error: code = NotFound desc = could not find container \"aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf\": container with ID starting with aa9b40ac9d9a2b0a13133553d4a471992c0749e3cb5c39d662cebdcde5319fcf not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.005806 4707 scope.go:117] "RemoveContainer" containerID="c116e23b2a283672e9ca77f1af4737cb856c5440e6347191c29040696e4f4b93" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.037818 4707 scope.go:117] "RemoveContainer" containerID="399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.048418 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xrvfw"] Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.078427 4707 scope.go:117] "RemoveContainer" containerID="da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.081837 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-xrvfw"] Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.096789 4707 scope.go:117] "RemoveContainer" containerID="ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.101347 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tpsmp"] Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.106092 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tpsmp"] Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.111054 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxdsw"] Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.114663 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wxdsw"] Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.118575 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hb89x"] Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.121565 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hb89x"] Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.128528 4707 scope.go:117] "RemoveContainer" containerID="399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.129701 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b\": container with ID starting with 399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b not found: ID does not exist" containerID="399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.129747 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b"} err="failed to get container status \"399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b\": rpc error: code = NotFound desc = could not find container \"399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b\": container with ID starting with 399195b511a60a32d4bee02bdfc07fffb6717b8adc786725e203b5967f421c6b not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.129782 4707 scope.go:117] "RemoveContainer" containerID="da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.130280 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60\": container with ID starting with da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60 not found: ID does not exist" containerID="da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.130324 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60"} err="failed to get container status \"da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60\": rpc error: code = NotFound desc = could not find container \"da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60\": container with ID starting with da67e90e1120148cb199b52ff53eaf072a292c7d2e46ad720accee6e73442a60 not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.130355 4707 scope.go:117] "RemoveContainer" containerID="ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 
03:34:04.130645 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5\": container with ID starting with ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5 not found: ID does not exist" containerID="ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.130674 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5"} err="failed to get container status \"ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5\": rpc error: code = NotFound desc = could not find container \"ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5\": container with ID starting with ff518e3de3926805464d505b6dfb2e2b0cef5ae5de68db945b9a58a506cbdfe5 not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.130729 4707 scope.go:117] "RemoveContainer" containerID="67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.145257 4707 scope.go:117] "RemoveContainer" containerID="1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.162063 4707 scope.go:117] "RemoveContainer" containerID="8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.178018 4707 scope.go:117] "RemoveContainer" containerID="67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.178413 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6\": container 
with ID starting with 67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6 not found: ID does not exist" containerID="67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.178454 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6"} err="failed to get container status \"67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6\": rpc error: code = NotFound desc = could not find container \"67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6\": container with ID starting with 67a56711b90bfb6cd70646795ea497cd25b51b5c10b07a32607ba0a4d35bc1b6 not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.178480 4707 scope.go:117] "RemoveContainer" containerID="1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.178813 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d\": container with ID starting with 1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d not found: ID does not exist" containerID="1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.178832 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d"} err="failed to get container status \"1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d\": rpc error: code = NotFound desc = could not find container \"1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d\": container with ID starting with 1a3f8d93865efceb6aee84babe2f631d21b5d7c5f80b88404ef98a3b9c7da70d not 
found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.178845 4707 scope.go:117] "RemoveContainer" containerID="8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.179037 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49\": container with ID starting with 8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49 not found: ID does not exist" containerID="8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.179075 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49"} err="failed to get container status \"8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49\": rpc error: code = NotFound desc = could not find container \"8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49\": container with ID starting with 8c2ab4243301e0406aaac645487bb6c43a62cef6e2e607dfe12a9cbb53462f49 not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.179087 4707 scope.go:117] "RemoveContainer" containerID="7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.194779 4707 scope.go:117] "RemoveContainer" containerID="7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.195512 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8\": container with ID starting with 7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8 not found: ID does not 
exist" containerID="7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.195594 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8"} err="failed to get container status \"7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8\": rpc error: code = NotFound desc = could not find container \"7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8\": container with ID starting with 7d5c80f774606286027538e23a50958dc5aa4ff57d71052ec406273ea4b705f8 not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.195647 4707 scope.go:117] "RemoveContainer" containerID="c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.212080 4707 scope.go:117] "RemoveContainer" containerID="eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.230899 4707 scope.go:117] "RemoveContainer" containerID="104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.244821 4707 scope.go:117] "RemoveContainer" containerID="c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.245429 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a\": container with ID starting with c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a not found: ID does not exist" containerID="c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.245466 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a"} err="failed to get container status \"c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a\": rpc error: code = NotFound desc = could not find container \"c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a\": container with ID starting with c7d161f901427d0dd002dcffc07dd04a5061d02366089d5f4a6f0cc847deab4a not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.245496 4707 scope.go:117] "RemoveContainer" containerID="eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.245955 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db\": container with ID starting with eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db not found: ID does not exist" containerID="eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.245995 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db"} err="failed to get container status \"eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db\": rpc error: code = NotFound desc = could not find container \"eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db\": container with ID starting with eb222676f9985f299a1499de71160ba9ae0495cb9772a0e1d8b1a894f79947db not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.246023 4707 scope.go:117] "RemoveContainer" containerID="104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c" Jan 29 03:34:04 crc kubenswrapper[4707]: E0129 03:34:04.246308 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c\": container with ID starting with 104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c not found: ID does not exist" containerID="104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.246339 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c"} err="failed to get container status \"104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c\": rpc error: code = NotFound desc = could not find container \"104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c\": container with ID starting with 104fa0e1df5ad993182b2e147a49f0293c56e4660545a6b745fc0f783edf533c not found: ID does not exist" Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.936746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" event={"ID":"d3599142-c844-4a86-9bef-e589d69f0ef4","Type":"ContainerStarted","Data":"606eda0c6a9c0862beb3c7ce34752d8bc97d0de5ec6031768c79c907b705d67b"} Jan 29 03:34:04 crc kubenswrapper[4707]: I0129 03:34:04.957414 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" podStartSLOduration=2.957382188 podStartE2EDuration="2.957382188s" podCreationTimestamp="2026-01-29 03:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:34:04.955460134 +0000 UTC m=+398.439689029" watchObservedRunningTime="2026-01-29 03:34:04.957382188 +0000 UTC m=+398.441611103" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.014303 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-2wrp2"] Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.014589 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerName="extract-utilities" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.014604 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerName="extract-utilities" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.014654 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerName="extract-content" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.014663 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerName="extract-content" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.014672 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.014678 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.014686 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" containerName="extract-utilities" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.014692 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" containerName="extract-utilities" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.014698 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.014704 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.014743 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.014751 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.014973 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerName="extract-utilities" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.014979 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerName="extract-utilities" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.014989 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerName="extract-content" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.014994 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerName="extract-content" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.015001 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeef9237-bd0b-494d-a3a0-8b6e54baa03e" containerName="marketplace-operator" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015011 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeef9237-bd0b-494d-a3a0-8b6e54baa03e" containerName="marketplace-operator" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.015043 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerName="extract-content" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015050 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerName="extract-content" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.015060 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerName="extract-utilities" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015065 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerName="extract-utilities" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.015073 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" containerName="extract-content" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015079 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" containerName="extract-content" Jan 29 03:34:05 crc kubenswrapper[4707]: E0129 03:34:05.015089 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015115 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015249 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d4ff70-611c-4a65-982c-f551baa66bd5" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015333 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeef9237-bd0b-494d-a3a0-8b6e54baa03e" containerName="marketplace-operator" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015341 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015349 4707 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.015356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" containerName="registry-server" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.017351 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.020717 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.021474 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2wrp2"] Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.142811 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jzxk\" (UniqueName: \"kubernetes.io/projected/87614809-814f-41aa-a98f-8d06b5875cd7-kube-api-access-9jzxk\") pod \"certified-operators-2wrp2\" (UID: \"87614809-814f-41aa-a98f-8d06b5875cd7\") " pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.142923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87614809-814f-41aa-a98f-8d06b5875cd7-utilities\") pod \"certified-operators-2wrp2\" (UID: \"87614809-814f-41aa-a98f-8d06b5875cd7\") " pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.143026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87614809-814f-41aa-a98f-8d06b5875cd7-catalog-content\") pod \"certified-operators-2wrp2\" 
(UID: \"87614809-814f-41aa-a98f-8d06b5875cd7\") " pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.209954 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fhkcz"] Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.212737 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.215988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.219291 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhkcz"] Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.243982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87614809-814f-41aa-a98f-8d06b5875cd7-catalog-content\") pod \"certified-operators-2wrp2\" (UID: \"87614809-814f-41aa-a98f-8d06b5875cd7\") " pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.244105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jzxk\" (UniqueName: \"kubernetes.io/projected/87614809-814f-41aa-a98f-8d06b5875cd7-kube-api-access-9jzxk\") pod \"certified-operators-2wrp2\" (UID: \"87614809-814f-41aa-a98f-8d06b5875cd7\") " pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.244149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87614809-814f-41aa-a98f-8d06b5875cd7-utilities\") pod \"certified-operators-2wrp2\" (UID: \"87614809-814f-41aa-a98f-8d06b5875cd7\") " 
pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.244944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87614809-814f-41aa-a98f-8d06b5875cd7-utilities\") pod \"certified-operators-2wrp2\" (UID: \"87614809-814f-41aa-a98f-8d06b5875cd7\") " pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.245114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87614809-814f-41aa-a98f-8d06b5875cd7-catalog-content\") pod \"certified-operators-2wrp2\" (UID: \"87614809-814f-41aa-a98f-8d06b5875cd7\") " pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.251440 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a832dac2-976f-45e7-adc9-fc29666d0721" path="/var/lib/kubelet/pods/a832dac2-976f-45e7-adc9-fc29666d0721/volumes" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.252308 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad1e990a-db38-4eb8-8ae9-6bd700728e48" path="/var/lib/kubelet/pods/ad1e990a-db38-4eb8-8ae9-6bd700728e48/volumes" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.253110 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d4ff70-611c-4a65-982c-f551baa66bd5" path="/var/lib/kubelet/pods/b4d4ff70-611c-4a65-982c-f551baa66bd5/volumes" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.254493 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c814051a-bbf1-4219-8089-8124cb1d3b7b" path="/var/lib/kubelet/pods/c814051a-bbf1-4219-8089-8124cb1d3b7b/volumes" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.255326 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeef9237-bd0b-494d-a3a0-8b6e54baa03e" 
path="/var/lib/kubelet/pods/eeef9237-bd0b-494d-a3a0-8b6e54baa03e/volumes" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.271463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jzxk\" (UniqueName: \"kubernetes.io/projected/87614809-814f-41aa-a98f-8d06b5875cd7-kube-api-access-9jzxk\") pod \"certified-operators-2wrp2\" (UID: \"87614809-814f-41aa-a98f-8d06b5875cd7\") " pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.346082 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-676mz\" (UniqueName: \"kubernetes.io/projected/9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2-kube-api-access-676mz\") pod \"redhat-marketplace-fhkcz\" (UID: \"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2\") " pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.346166 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2-utilities\") pod \"redhat-marketplace-fhkcz\" (UID: \"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2\") " pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.346196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2-catalog-content\") pod \"redhat-marketplace-fhkcz\" (UID: \"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2\") " pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.351612 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2wrp2" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.449521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-676mz\" (UniqueName: \"kubernetes.io/projected/9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2-kube-api-access-676mz\") pod \"redhat-marketplace-fhkcz\" (UID: \"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2\") " pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.450288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2-catalog-content\") pod \"redhat-marketplace-fhkcz\" (UID: \"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2\") " pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.450698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2-utilities\") pod \"redhat-marketplace-fhkcz\" (UID: \"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2\") " pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.451222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2-catalog-content\") pod \"redhat-marketplace-fhkcz\" (UID: \"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2\") " pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.451397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2-utilities\") pod \"redhat-marketplace-fhkcz\" (UID: \"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2\") " 
pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.467976 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-676mz\" (UniqueName: \"kubernetes.io/projected/9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2-kube-api-access-676mz\") pod \"redhat-marketplace-fhkcz\" (UID: \"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2\") " pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.547489 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhkcz" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.763851 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2wrp2"] Jan 29 03:34:05 crc kubenswrapper[4707]: W0129 03:34:05.768178 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87614809_814f_41aa_a98f_8d06b5875cd7.slice/crio-7f1fe75753313d2e581dc72ef785d20697dd5a8eb6559f02ce55dc61fcb8773e WatchSource:0}: Error finding container 7f1fe75753313d2e581dc72ef785d20697dd5a8eb6559f02ce55dc61fcb8773e: Status 404 returned error can't find the container with id 7f1fe75753313d2e581dc72ef785d20697dd5a8eb6559f02ce55dc61fcb8773e Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.948376 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhkcz"] Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.948907 4707 generic.go:334] "Generic (PLEG): container finished" podID="87614809-814f-41aa-a98f-8d06b5875cd7" containerID="d7533c8755a043ff59876397ca94c5c723ee5dc1a18bd7f5bdc15766880d7e44" exitCode=0 Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.949020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wrp2" 
event={"ID":"87614809-814f-41aa-a98f-8d06b5875cd7","Type":"ContainerDied","Data":"d7533c8755a043ff59876397ca94c5c723ee5dc1a18bd7f5bdc15766880d7e44"} Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.949096 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wrp2" event={"ID":"87614809-814f-41aa-a98f-8d06b5875cd7","Type":"ContainerStarted","Data":"7f1fe75753313d2e581dc72ef785d20697dd5a8eb6559f02ce55dc61fcb8773e"} Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.949415 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:05 crc kubenswrapper[4707]: I0129 03:34:05.953759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qwgfr" Jan 29 03:34:05 crc kubenswrapper[4707]: W0129 03:34:05.988187 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ae25a3b_cb3e_4ea6_9373_31d52ee5dda2.slice/crio-3c70a5639f80c069956f8aa3233d1138528e6b27960c4395ab33f5de96bbe2f1 WatchSource:0}: Error finding container 3c70a5639f80c069956f8aa3233d1138528e6b27960c4395ab33f5de96bbe2f1: Status 404 returned error can't find the container with id 3c70a5639f80c069956f8aa3233d1138528e6b27960c4395ab33f5de96bbe2f1 Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.814257 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ndnr5"] Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.815927 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.818334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.824641 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ndnr5"] Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.954557 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2" containerID="51c9ed3bff53c1b4871b6887eb7ae63d6f259455b793f7fe33cf25ddaa216404" exitCode=0 Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.955814 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkcz" event={"ID":"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2","Type":"ContainerDied","Data":"51c9ed3bff53c1b4871b6887eb7ae63d6f259455b793f7fe33cf25ddaa216404"} Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.955863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkcz" event={"ID":"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2","Type":"ContainerStarted","Data":"3c70a5639f80c069956f8aa3233d1138528e6b27960c4395ab33f5de96bbe2f1"} Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.979709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgwwg\" (UniqueName: \"kubernetes.io/projected/a99f8ef9-ec05-437a-aec4-e7d7bb669485-kube-api-access-lgwwg\") pod \"community-operators-ndnr5\" (UID: \"a99f8ef9-ec05-437a-aec4-e7d7bb669485\") " pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.979771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a99f8ef9-ec05-437a-aec4-e7d7bb669485-catalog-content\") pod \"community-operators-ndnr5\" (UID: \"a99f8ef9-ec05-437a-aec4-e7d7bb669485\") " pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:06 crc kubenswrapper[4707]: I0129 03:34:06.979942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99f8ef9-ec05-437a-aec4-e7d7bb669485-utilities\") pod \"community-operators-ndnr5\" (UID: \"a99f8ef9-ec05-437a-aec4-e7d7bb669485\") " pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.081917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99f8ef9-ec05-437a-aec4-e7d7bb669485-utilities\") pod \"community-operators-ndnr5\" (UID: \"a99f8ef9-ec05-437a-aec4-e7d7bb669485\") " pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.081997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgwwg\" (UniqueName: \"kubernetes.io/projected/a99f8ef9-ec05-437a-aec4-e7d7bb669485-kube-api-access-lgwwg\") pod \"community-operators-ndnr5\" (UID: \"a99f8ef9-ec05-437a-aec4-e7d7bb669485\") " pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.082039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99f8ef9-ec05-437a-aec4-e7d7bb669485-catalog-content\") pod \"community-operators-ndnr5\" (UID: \"a99f8ef9-ec05-437a-aec4-e7d7bb669485\") " pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.082569 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a99f8ef9-ec05-437a-aec4-e7d7bb669485-utilities\") pod \"community-operators-ndnr5\" (UID: \"a99f8ef9-ec05-437a-aec4-e7d7bb669485\") " pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.082684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99f8ef9-ec05-437a-aec4-e7d7bb669485-catalog-content\") pod \"community-operators-ndnr5\" (UID: \"a99f8ef9-ec05-437a-aec4-e7d7bb669485\") " pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.104757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgwwg\" (UniqueName: \"kubernetes.io/projected/a99f8ef9-ec05-437a-aec4-e7d7bb669485-kube-api-access-lgwwg\") pod \"community-operators-ndnr5\" (UID: \"a99f8ef9-ec05-437a-aec4-e7d7bb669485\") " pod="openshift-marketplace/community-operators-ndnr5" Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.133467 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ndnr5"
Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.555364 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ndnr5"]
Jan 29 03:34:07 crc kubenswrapper[4707]: W0129 03:34:07.566304 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda99f8ef9_ec05_437a_aec4_e7d7bb669485.slice/crio-1bd30e405d9a7a096459cbed9f4685a99c73427e777f2e217df403b55c807829 WatchSource:0}: Error finding container 1bd30e405d9a7a096459cbed9f4685a99c73427e777f2e217df403b55c807829: Status 404 returned error can't find the container with id 1bd30e405d9a7a096459cbed9f4685a99c73427e777f2e217df403b55c807829
Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.962506 4707 generic.go:334] "Generic (PLEG): container finished" podID="87614809-814f-41aa-a98f-8d06b5875cd7" containerID="1b1fb47b179f85791f1894f00fad0a7e6e68e2572285af95a424218cbf9fae6a" exitCode=0
Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.962613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wrp2" event={"ID":"87614809-814f-41aa-a98f-8d06b5875cd7","Type":"ContainerDied","Data":"1b1fb47b179f85791f1894f00fad0a7e6e68e2572285af95a424218cbf9fae6a"}
Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.963871 4707 generic.go:334] "Generic (PLEG): container finished" podID="a99f8ef9-ec05-437a-aec4-e7d7bb669485" containerID="b039d4b5221d65b3cbbe283d87ba8f221cb3fb0c2a2c76da9168a5664c1f0406" exitCode=0
Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.963952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndnr5" event={"ID":"a99f8ef9-ec05-437a-aec4-e7d7bb669485","Type":"ContainerDied","Data":"b039d4b5221d65b3cbbe283d87ba8f221cb3fb0c2a2c76da9168a5664c1f0406"}
Jan 29 03:34:07 crc kubenswrapper[4707]: I0129 03:34:07.963992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndnr5" event={"ID":"a99f8ef9-ec05-437a-aec4-e7d7bb669485","Type":"ContainerStarted","Data":"1bd30e405d9a7a096459cbed9f4685a99c73427e777f2e217df403b55c807829"}
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.017688 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cb77w"]
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.019474 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.021442 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cb77w"]
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.022180 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.099495 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b4a55f4-b7e6-454e-a8f7-066ce8edb801-utilities\") pod \"redhat-operators-cb77w\" (UID: \"0b4a55f4-b7e6-454e-a8f7-066ce8edb801\") " pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.099605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2c2f\" (UniqueName: \"kubernetes.io/projected/0b4a55f4-b7e6-454e-a8f7-066ce8edb801-kube-api-access-k2c2f\") pod \"redhat-operators-cb77w\" (UID: \"0b4a55f4-b7e6-454e-a8f7-066ce8edb801\") " pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.099643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b4a55f4-b7e6-454e-a8f7-066ce8edb801-catalog-content\") pod \"redhat-operators-cb77w\" (UID: \"0b4a55f4-b7e6-454e-a8f7-066ce8edb801\") " pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.200390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b4a55f4-b7e6-454e-a8f7-066ce8edb801-utilities\") pod \"redhat-operators-cb77w\" (UID: \"0b4a55f4-b7e6-454e-a8f7-066ce8edb801\") " pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.200792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2c2f\" (UniqueName: \"kubernetes.io/projected/0b4a55f4-b7e6-454e-a8f7-066ce8edb801-kube-api-access-k2c2f\") pod \"redhat-operators-cb77w\" (UID: \"0b4a55f4-b7e6-454e-a8f7-066ce8edb801\") " pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.200952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b4a55f4-b7e6-454e-a8f7-066ce8edb801-catalog-content\") pod \"redhat-operators-cb77w\" (UID: \"0b4a55f4-b7e6-454e-a8f7-066ce8edb801\") " pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.200960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b4a55f4-b7e6-454e-a8f7-066ce8edb801-utilities\") pod \"redhat-operators-cb77w\" (UID: \"0b4a55f4-b7e6-454e-a8f7-066ce8edb801\") " pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.201248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b4a55f4-b7e6-454e-a8f7-066ce8edb801-catalog-content\") pod \"redhat-operators-cb77w\" (UID: \"0b4a55f4-b7e6-454e-a8f7-066ce8edb801\") " pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.220743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2c2f\" (UniqueName: \"kubernetes.io/projected/0b4a55f4-b7e6-454e-a8f7-066ce8edb801-kube-api-access-k2c2f\") pod \"redhat-operators-cb77w\" (UID: \"0b4a55f4-b7e6-454e-a8f7-066ce8edb801\") " pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.373128 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.778947 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cb77w"]
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.978635 4707 generic.go:334] "Generic (PLEG): container finished" podID="0b4a55f4-b7e6-454e-a8f7-066ce8edb801" containerID="de5bec9f325c105509a12e287725a81f901e46b0cd88692e24950b08926c1d87" exitCode=0
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.978727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb77w" event={"ID":"0b4a55f4-b7e6-454e-a8f7-066ce8edb801","Type":"ContainerDied","Data":"de5bec9f325c105509a12e287725a81f901e46b0cd88692e24950b08926c1d87"}
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.979139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb77w" event={"ID":"0b4a55f4-b7e6-454e-a8f7-066ce8edb801","Type":"ContainerStarted","Data":"c2a5306d58864fcbbfa695bec2fca8e1f96be0b24cb4f84fc42802b3f1edf64f"}
Jan 29 03:34:08 crc kubenswrapper[4707]: I0129 03:34:08.982645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wrp2" event={"ID":"87614809-814f-41aa-a98f-8d06b5875cd7","Type":"ContainerStarted","Data":"ac28735b85159cea066ec9126046b7b2c3b98e19f5f4f3d9e6c286f0f00bb866"}
Jan 29 03:34:09 crc kubenswrapper[4707]: I0129 03:34:09.990230 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2" containerID="dfc0e01852fd6f2b8518961465dddb81e975e281deb02f35022bfd6bfab64041" exitCode=0
Jan 29 03:34:09 crc kubenswrapper[4707]: I0129 03:34:09.990306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkcz" event={"ID":"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2","Type":"ContainerDied","Data":"dfc0e01852fd6f2b8518961465dddb81e975e281deb02f35022bfd6bfab64041"}
Jan 29 03:34:10 crc kubenswrapper[4707]: I0129 03:34:10.020133 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2wrp2" podStartSLOduration=3.6206636789999997 podStartE2EDuration="6.020104699s" podCreationTimestamp="2026-01-29 03:34:04 +0000 UTC" firstStartedPulling="2026-01-29 03:34:05.955599775 +0000 UTC m=+399.439828690" lastFinishedPulling="2026-01-29 03:34:08.355040805 +0000 UTC m=+401.839269710" observedRunningTime="2026-01-29 03:34:09.01365442 +0000 UTC m=+402.497883345" watchObservedRunningTime="2026-01-29 03:34:10.020104699 +0000 UTC m=+403.504333604"
Jan 29 03:34:11 crc kubenswrapper[4707]: I0129 03:34:11.000083 4707 generic.go:334] "Generic (PLEG): container finished" podID="0b4a55f4-b7e6-454e-a8f7-066ce8edb801" containerID="bdc85b84113567b6a940e7e275199be5220ba4c0f9ac86412ed690fac9cb0ef6" exitCode=0
Jan 29 03:34:11 crc kubenswrapper[4707]: I0129 03:34:11.000192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb77w" event={"ID":"0b4a55f4-b7e6-454e-a8f7-066ce8edb801","Type":"ContainerDied","Data":"bdc85b84113567b6a940e7e275199be5220ba4c0f9ac86412ed690fac9cb0ef6"}
Jan 29 03:34:11 crc kubenswrapper[4707]: I0129 03:34:11.005027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkcz" event={"ID":"9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2","Type":"ContainerStarted","Data":"b2f875cd718f550c8f5e5a63b02e2dd81bb2f9f23e1b3c15c255cb0416af0b48"}
Jan 29 03:34:11 crc kubenswrapper[4707]: I0129 03:34:11.008804 4707 generic.go:334] "Generic (PLEG): container finished" podID="a99f8ef9-ec05-437a-aec4-e7d7bb669485" containerID="4d08d465eaeb068aaca4efb524f7dbce88e63989dc5f103cdee4d8aae1bead10" exitCode=0
Jan 29 03:34:11 crc kubenswrapper[4707]: I0129 03:34:11.008847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndnr5" event={"ID":"a99f8ef9-ec05-437a-aec4-e7d7bb669485","Type":"ContainerDied","Data":"4d08d465eaeb068aaca4efb524f7dbce88e63989dc5f103cdee4d8aae1bead10"}
Jan 29 03:34:11 crc kubenswrapper[4707]: I0129 03:34:11.076919 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fhkcz" podStartSLOduration=2.629529159 podStartE2EDuration="6.076889583s" podCreationTimestamp="2026-01-29 03:34:05 +0000 UTC" firstStartedPulling="2026-01-29 03:34:06.956345914 +0000 UTC m=+400.440574819" lastFinishedPulling="2026-01-29 03:34:10.403706338 +0000 UTC m=+403.887935243" observedRunningTime="2026-01-29 03:34:11.076505103 +0000 UTC m=+404.560734028" watchObservedRunningTime="2026-01-29 03:34:11.076889583 +0000 UTC m=+404.561118488"
Jan 29 03:34:12 crc kubenswrapper[4707]: I0129 03:34:12.017984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb77w" event={"ID":"0b4a55f4-b7e6-454e-a8f7-066ce8edb801","Type":"ContainerStarted","Data":"87d4e399adf35385b6e258be03b11bb8f507e42211a7e5790a8319f5dde07f9f"}
Jan 29 03:34:12 crc kubenswrapper[4707]: I0129 03:34:12.021174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndnr5" event={"ID":"a99f8ef9-ec05-437a-aec4-e7d7bb669485","Type":"ContainerStarted","Data":"136b30e2a33b8b17b493c09176c1e90b301846a6f263c166b95488e4d146f34c"}
Jan 29 03:34:12 crc kubenswrapper[4707]: I0129 03:34:12.039639 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cb77w" podStartSLOduration=2.577725416 podStartE2EDuration="5.039620832s" podCreationTimestamp="2026-01-29 03:34:07 +0000 UTC" firstStartedPulling="2026-01-29 03:34:08.980780985 +0000 UTC m=+402.465009890" lastFinishedPulling="2026-01-29 03:34:11.442676401 +0000 UTC m=+404.926905306" observedRunningTime="2026-01-29 03:34:12.038052818 +0000 UTC m=+405.522281713" watchObservedRunningTime="2026-01-29 03:34:12.039620832 +0000 UTC m=+405.523849737"
Jan 29 03:34:12 crc kubenswrapper[4707]: I0129 03:34:12.060891 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ndnr5" podStartSLOduration=2.644766903 podStartE2EDuration="6.060866629s" podCreationTimestamp="2026-01-29 03:34:06 +0000 UTC" firstStartedPulling="2026-01-29 03:34:07.966013162 +0000 UTC m=+401.450242067" lastFinishedPulling="2026-01-29 03:34:11.382112888 +0000 UTC m=+404.866341793" observedRunningTime="2026-01-29 03:34:12.058209344 +0000 UTC m=+405.542438249" watchObservedRunningTime="2026-01-29 03:34:12.060866629 +0000 UTC m=+405.545095534"
Jan 29 03:34:12 crc kubenswrapper[4707]: I0129 03:34:12.345607 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-z575b" podUID="fe4f03a9-cb43-4405-902b-eb2cdb645eb8" containerName="registry" containerID="cri-o://0be403bbfa67fbb6ba49df6a32247461d85f19802e4c33c7e4807f4cf38656a8" gracePeriod=30
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.027769 4707 generic.go:334] "Generic (PLEG): container finished" podID="fe4f03a9-cb43-4405-902b-eb2cdb645eb8" containerID="0be403bbfa67fbb6ba49df6a32247461d85f19802e4c33c7e4807f4cf38656a8" exitCode=0
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.027868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-z575b" event={"ID":"fe4f03a9-cb43-4405-902b-eb2cdb645eb8","Type":"ContainerDied","Data":"0be403bbfa67fbb6ba49df6a32247461d85f19802e4c33c7e4807f4cf38656a8"}
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.028266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-z575b" event={"ID":"fe4f03a9-cb43-4405-902b-eb2cdb645eb8","Type":"ContainerDied","Data":"f1cd557f639d6178c7eb2d82cc29114b1330785da91b11e2283763b91307715a"}
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.028282 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1cd557f639d6178c7eb2d82cc29114b1330785da91b11e2283763b91307715a"
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.470376 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.673426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-installation-pull-secrets\") pod \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") "
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.673668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") "
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.673739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-certificates\") pod \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") "
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.673773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-ca-trust-extracted\") pod \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") "
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.673811 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-bound-sa-token\") pod \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") "
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.673844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-trusted-ca\") pod \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") "
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.673870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-tls\") pod \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") "
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.673948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs72p\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-kube-api-access-hs72p\") pod \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\" (UID: \"fe4f03a9-cb43-4405-902b-eb2cdb645eb8\") "
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.675369 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fe4f03a9-cb43-4405-902b-eb2cdb645eb8" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.675523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fe4f03a9-cb43-4405-902b-eb2cdb645eb8" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.681632 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fe4f03a9-cb43-4405-902b-eb2cdb645eb8" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.681889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fe4f03a9-cb43-4405-902b-eb2cdb645eb8" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.682284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fe4f03a9-cb43-4405-902b-eb2cdb645eb8" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.682423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-kube-api-access-hs72p" (OuterVolumeSpecName: "kube-api-access-hs72p") pod "fe4f03a9-cb43-4405-902b-eb2cdb645eb8" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8"). InnerVolumeSpecName "kube-api-access-hs72p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.685502 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fe4f03a9-cb43-4405-902b-eb2cdb645eb8" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.693720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fe4f03a9-cb43-4405-902b-eb2cdb645eb8" (UID: "fe4f03a9-cb43-4405-902b-eb2cdb645eb8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.774803 4707 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.775109 4707 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-certificates\") on node \"crc\" DevicePath \"\""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.775120 4707 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.775129 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.775138 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.775146 4707 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 29 03:34:13 crc kubenswrapper[4707]: I0129 03:34:13.775156 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs72p\" (UniqueName: \"kubernetes.io/projected/fe4f03a9-cb43-4405-902b-eb2cdb645eb8-kube-api-access-hs72p\") on node \"crc\" DevicePath \"\""
Jan 29 03:34:14 crc kubenswrapper[4707]: I0129 03:34:14.033846 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-z575b"
Jan 29 03:34:14 crc kubenswrapper[4707]: I0129 03:34:14.068037 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z575b"]
Jan 29 03:34:14 crc kubenswrapper[4707]: I0129 03:34:14.072305 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z575b"]
Jan 29 03:34:15 crc kubenswrapper[4707]: I0129 03:34:15.251701 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4f03a9-cb43-4405-902b-eb2cdb645eb8" path="/var/lib/kubelet/pods/fe4f03a9-cb43-4405-902b-eb2cdb645eb8/volumes"
Jan 29 03:34:15 crc kubenswrapper[4707]: I0129 03:34:15.352521 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2wrp2"
Jan 29 03:34:15 crc kubenswrapper[4707]: I0129 03:34:15.353163 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2wrp2"
Jan 29 03:34:15 crc kubenswrapper[4707]: I0129 03:34:15.420107 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2wrp2"
Jan 29 03:34:15 crc kubenswrapper[4707]: I0129 03:34:15.547813 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fhkcz"
Jan 29 03:34:15 crc kubenswrapper[4707]: I0129 03:34:15.547888 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fhkcz"
Jan 29 03:34:15 crc kubenswrapper[4707]: I0129 03:34:15.586053 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fhkcz"
Jan 29 03:34:16 crc kubenswrapper[4707]: I0129 03:34:16.097922 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2wrp2"
Jan 29 03:34:16 crc kubenswrapper[4707]: I0129 03:34:16.101752 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fhkcz"
Jan 29 03:34:17 crc kubenswrapper[4707]: I0129 03:34:17.133952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ndnr5"
Jan 29 03:34:17 crc kubenswrapper[4707]: I0129 03:34:17.134011 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ndnr5"
Jan 29 03:34:17 crc kubenswrapper[4707]: I0129 03:34:17.212236 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ndnr5"
Jan 29 03:34:18 crc kubenswrapper[4707]: I0129 03:34:18.112082 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ndnr5"
Jan 29 03:34:18 crc kubenswrapper[4707]: I0129 03:34:18.373928 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:18 crc kubenswrapper[4707]: I0129 03:34:18.374010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:18 crc kubenswrapper[4707]: I0129 03:34:18.417964 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:34:19 crc kubenswrapper[4707]: I0129 03:34:19.118687 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cb77w"
Jan 29 03:36:03 crc kubenswrapper[4707]: I0129 03:36:03.463196 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 03:36:03 crc kubenswrapper[4707]: I0129 03:36:03.464522 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 03:36:27 crc kubenswrapper[4707]: I0129 03:36:27.517172 4707 scope.go:117] "RemoveContainer" containerID="0be403bbfa67fbb6ba49df6a32247461d85f19802e4c33c7e4807f4cf38656a8"
Jan 29 03:36:27 crc kubenswrapper[4707]: I0129 03:36:27.546913 4707 scope.go:117] "RemoveContainer" containerID="d40a66ec96f5ca57e43ff814f15667bfd22c3643ec24b11e058c97106c72c348"
Jan 29 03:36:33 crc kubenswrapper[4707]: I0129 03:36:33.463470 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 03:36:33 crc kubenswrapper[4707]: I0129 03:36:33.464104 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 03:37:03 crc kubenswrapper[4707]: I0129 03:37:03.463235 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 03:37:03 crc kubenswrapper[4707]: I0129 03:37:03.464271 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 03:37:03 crc kubenswrapper[4707]: I0129 03:37:03.464372 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l"
Jan 29 03:37:03 crc kubenswrapper[4707]: I0129 03:37:03.465698 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d626ab43efa55857c89f32b874520ad65ab395b5a2359de01e47becbb927c08"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 29 03:37:03 crc kubenswrapper[4707]: I0129 03:37:03.465796 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://7d626ab43efa55857c89f32b874520ad65ab395b5a2359de01e47becbb927c08" gracePeriod=600
Jan 29 03:37:04 crc kubenswrapper[4707]: I0129 03:37:04.222461 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="7d626ab43efa55857c89f32b874520ad65ab395b5a2359de01e47becbb927c08" exitCode=0
Jan 29 03:37:04 crc kubenswrapper[4707]: I0129 03:37:04.222617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"7d626ab43efa55857c89f32b874520ad65ab395b5a2359de01e47becbb927c08"}
Jan 29 03:37:04 crc kubenswrapper[4707]: I0129 03:37:04.222979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"fdbfa93f1cdcabdbaa105826dff7f462fdab9740abd38ddc05a6f8c6801cf011"}
Jan 29 03:37:04 crc kubenswrapper[4707]: I0129 03:37:04.223016 4707 scope.go:117] "RemoveContainer" containerID="2018b8d36afa3f2a5c920f93a22bd21150b05028d7be0b59b8d8babfd9ed3779"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.529064 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-h296l"]
Jan 29 03:38:36 crc kubenswrapper[4707]: E0129 03:38:36.530287 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4f03a9-cb43-4405-902b-eb2cdb645eb8" containerName="registry"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.530310 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4f03a9-cb43-4405-902b-eb2cdb645eb8" containerName="registry"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.530488 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe4f03a9-cb43-4405-902b-eb2cdb645eb8" containerName="registry"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.531165 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-h296l"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.531850 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-khpr5"]
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.534428 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.534461 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8dspg"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.534747 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.541740 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-khpr5"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.543633 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-gbpk6"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.546648 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vbkth"]
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.547398 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.549049 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-spjf7"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.581570 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-h296l"]
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.588650 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-khpr5"]
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.594063 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vbkth"]
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.667941 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dh5p\" (UniqueName: \"kubernetes.io/projected/e32edc95-69cc-48f9-8840-a8bba34d4649-kube-api-access-9dh5p\") pod \"cert-manager-webhook-687f57d79b-vbkth\" (UID: \"e32edc95-69cc-48f9-8840-a8bba34d4649\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.668011 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tphj7\" (UniqueName: \"kubernetes.io/projected/2eac8e69-1dad-4d12-bdc5-24cb7659f04d-kube-api-access-tphj7\") pod \"cert-manager-858654f9db-khpr5\" (UID: \"2eac8e69-1dad-4d12-bdc5-24cb7659f04d\") " pod="cert-manager/cert-manager-858654f9db-khpr5"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.668060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lzzd\" (UniqueName: \"kubernetes.io/projected/ef3e8867-84ba-49f9-878e-482ae14faaa7-kube-api-access-4lzzd\") pod \"cert-manager-cainjector-cf98fcc89-h296l\" (UID: \"ef3e8867-84ba-49f9-878e-482ae14faaa7\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-h296l"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.769252 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tphj7\" (UniqueName: \"kubernetes.io/projected/2eac8e69-1dad-4d12-bdc5-24cb7659f04d-kube-api-access-tphj7\") pod \"cert-manager-858654f9db-khpr5\" (UID: \"2eac8e69-1dad-4d12-bdc5-24cb7659f04d\") " pod="cert-manager/cert-manager-858654f9db-khpr5"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.769444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lzzd\" (UniqueName: \"kubernetes.io/projected/ef3e8867-84ba-49f9-878e-482ae14faaa7-kube-api-access-4lzzd\") pod \"cert-manager-cainjector-cf98fcc89-h296l\" (UID: \"ef3e8867-84ba-49f9-878e-482ae14faaa7\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-h296l"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.769534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dh5p\" (UniqueName: \"kubernetes.io/projected/e32edc95-69cc-48f9-8840-a8bba34d4649-kube-api-access-9dh5p\") pod \"cert-manager-webhook-687f57d79b-vbkth\" (UID: \"e32edc95-69cc-48f9-8840-a8bba34d4649\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.789162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lzzd\" (UniqueName: \"kubernetes.io/projected/ef3e8867-84ba-49f9-878e-482ae14faaa7-kube-api-access-4lzzd\") pod \"cert-manager-cainjector-cf98fcc89-h296l\" (UID: \"ef3e8867-84ba-49f9-878e-482ae14faaa7\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-h296l"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.792450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dh5p\" (UniqueName: \"kubernetes.io/projected/e32edc95-69cc-48f9-8840-a8bba34d4649-kube-api-access-9dh5p\") pod \"cert-manager-webhook-687f57d79b-vbkth\" (UID: \"e32edc95-69cc-48f9-8840-a8bba34d4649\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.792813 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tphj7\" (UniqueName: \"kubernetes.io/projected/2eac8e69-1dad-4d12-bdc5-24cb7659f04d-kube-api-access-tphj7\") pod \"cert-manager-858654f9db-khpr5\" (UID: \"2eac8e69-1dad-4d12-bdc5-24cb7659f04d\") " pod="cert-manager/cert-manager-858654f9db-khpr5"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.852698 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-h296l"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.864052 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-khpr5"
Jan 29 03:38:36 crc kubenswrapper[4707]: I0129 03:38:36.873933 4707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth" Jan 29 03:38:37 crc kubenswrapper[4707]: I0129 03:38:37.313322 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-h296l"] Jan 29 03:38:37 crc kubenswrapper[4707]: I0129 03:38:37.321830 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 03:38:37 crc kubenswrapper[4707]: I0129 03:38:37.360383 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-khpr5"] Jan 29 03:38:37 crc kubenswrapper[4707]: W0129 03:38:37.364391 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode32edc95_69cc_48f9_8840_a8bba34d4649.slice/crio-194cfb549cba87aecd1a99cc4c8677aace28d04f77b0095c0d27156b93c14a8a WatchSource:0}: Error finding container 194cfb549cba87aecd1a99cc4c8677aace28d04f77b0095c0d27156b93c14a8a: Status 404 returned error can't find the container with id 194cfb549cba87aecd1a99cc4c8677aace28d04f77b0095c0d27156b93c14a8a Jan 29 03:38:37 crc kubenswrapper[4707]: I0129 03:38:37.367497 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vbkth"] Jan 29 03:38:37 crc kubenswrapper[4707]: I0129 03:38:37.924027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth" event={"ID":"e32edc95-69cc-48f9-8840-a8bba34d4649","Type":"ContainerStarted","Data":"194cfb549cba87aecd1a99cc4c8677aace28d04f77b0095c0d27156b93c14a8a"} Jan 29 03:38:37 crc kubenswrapper[4707]: I0129 03:38:37.925654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-khpr5" event={"ID":"2eac8e69-1dad-4d12-bdc5-24cb7659f04d","Type":"ContainerStarted","Data":"a363bd38fbcd26b953394096a11fed58370dfc934d44735ec6595153d7bdc4a9"} Jan 29 03:38:37 crc kubenswrapper[4707]: 
I0129 03:38:37.926781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-h296l" event={"ID":"ef3e8867-84ba-49f9-878e-482ae14faaa7","Type":"ContainerStarted","Data":"d622aa571f411b303a50efc1f5f387631ee54636416d29e13c346a976bf60c14"} Jan 29 03:38:41 crc kubenswrapper[4707]: I0129 03:38:41.954239 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth" event={"ID":"e32edc95-69cc-48f9-8840-a8bba34d4649","Type":"ContainerStarted","Data":"df5edb7adbd8dc0ae753ed966417dc031ef186d64eb984777883d0e1b81f13fa"} Jan 29 03:38:41 crc kubenswrapper[4707]: I0129 03:38:41.954816 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth" Jan 29 03:38:41 crc kubenswrapper[4707]: I0129 03:38:41.957749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-khpr5" event={"ID":"2eac8e69-1dad-4d12-bdc5-24cb7659f04d","Type":"ContainerStarted","Data":"581155fa22cac1f7f4774c91f1d703578be3b515694204d1e97d60acb5f6f767"} Jan 29 03:38:41 crc kubenswrapper[4707]: I0129 03:38:41.960235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-h296l" event={"ID":"ef3e8867-84ba-49f9-878e-482ae14faaa7","Type":"ContainerStarted","Data":"84f4e9b55101018a3b03ec8c428d11f08646f94c1b1e6015b801706fd02ef897"} Jan 29 03:38:41 crc kubenswrapper[4707]: I0129 03:38:41.981114 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth" podStartSLOduration=1.762255531 podStartE2EDuration="5.981084872s" podCreationTimestamp="2026-01-29 03:38:36 +0000 UTC" firstStartedPulling="2026-01-29 03:38:37.367420465 +0000 UTC m=+670.851649360" lastFinishedPulling="2026-01-29 03:38:41.586249796 +0000 UTC m=+675.070478701" observedRunningTime="2026-01-29 03:38:41.976281205 +0000 UTC 
m=+675.460510150" watchObservedRunningTime="2026-01-29 03:38:41.981084872 +0000 UTC m=+675.465313777" Jan 29 03:38:41 crc kubenswrapper[4707]: I0129 03:38:41.995968 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-h296l" podStartSLOduration=1.641857726 podStartE2EDuration="5.995942376s" podCreationTimestamp="2026-01-29 03:38:36 +0000 UTC" firstStartedPulling="2026-01-29 03:38:37.321520225 +0000 UTC m=+670.805749130" lastFinishedPulling="2026-01-29 03:38:41.675604865 +0000 UTC m=+675.159833780" observedRunningTime="2026-01-29 03:38:41.994906177 +0000 UTC m=+675.479135092" watchObservedRunningTime="2026-01-29 03:38:41.995942376 +0000 UTC m=+675.480171281" Jan 29 03:38:42 crc kubenswrapper[4707]: I0129 03:38:42.028184 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-khpr5" podStartSLOduration=2.230632346 podStartE2EDuration="6.028157815s" podCreationTimestamp="2026-01-29 03:38:36 +0000 UTC" firstStartedPulling="2026-01-29 03:38:37.363354809 +0000 UTC m=+670.847583724" lastFinishedPulling="2026-01-29 03:38:41.160880248 +0000 UTC m=+674.645109193" observedRunningTime="2026-01-29 03:38:42.022927846 +0000 UTC m=+675.507156761" watchObservedRunningTime="2026-01-29 03:38:42.028157815 +0000 UTC m=+675.512386730" Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.650142 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nn7fm"] Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.651110 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovn-controller" containerID="cri-o://e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125" gracePeriod=30 Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.651165 4707 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="nbdb" containerID="cri-o://d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394" gracePeriod=30 Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.651200 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4" gracePeriod=30 Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.651288 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="sbdb" containerID="cri-o://fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89" gracePeriod=30 Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.651288 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="northd" containerID="cri-o://4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63" gracePeriod=30 Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.651294 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kube-rbac-proxy-node" containerID="cri-o://058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943" gracePeriod=30 Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.651332 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovn-acl-logging" 
containerID="cri-o://6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55" gracePeriod=30 Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.681740 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" containerID="cri-o://e6ba46a3b002e4a00eceb67f566d451ea3eff8379adb7ccdc4ae2f8298abd464" gracePeriod=30 Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.877523 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-vbkth" Jan 29 03:38:46 crc kubenswrapper[4707]: I0129 03:38:46.999875 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovnkube-controller/3.log" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.003988 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovn-acl-logging/0.log" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.004886 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovn-controller/0.log" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005278 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="e6ba46a3b002e4a00eceb67f566d451ea3eff8379adb7ccdc4ae2f8298abd464" exitCode=0 Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005311 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89" exitCode=0 Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005321 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394" exitCode=0 Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005330 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63" exitCode=0 Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005338 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4" exitCode=0 Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005347 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943" exitCode=0 Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005357 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55" exitCode=143 Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005365 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerID="e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125" exitCode=143 Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"e6ba46a3b002e4a00eceb67f566d451ea3eff8379adb7ccdc4ae2f8298abd464"} Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" 
event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89"} Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005471 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394"} Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63"} Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4"} Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943"} Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005513 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55"} Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" 
event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125"} Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.005559 4707 scope.go:117] "RemoveContainer" containerID="bc1180ed19d2e404801e2b2380935c0e74a87499c86d1f037ec60f9e4b9c894d" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.008510 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/2.log" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.009206 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/1.log" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.009302 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd938209-46da-4f33-8496-23beb193ac96" containerID="5655bbc11ea24b509e78271170c1b3b66ff0b6788c59aa6680676258a96736b3" exitCode=2 Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.009339 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vh9xt" event={"ID":"bd938209-46da-4f33-8496-23beb193ac96","Type":"ContainerDied","Data":"5655bbc11ea24b509e78271170c1b3b66ff0b6788c59aa6680676258a96736b3"} Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.009997 4707 scope.go:117] "RemoveContainer" containerID="5655bbc11ea24b509e78271170c1b3b66ff0b6788c59aa6680676258a96736b3" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.010256 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vh9xt_openshift-multus(bd938209-46da-4f33-8496-23beb193ac96)\"" pod="openshift-multus/multus-vh9xt" podUID="bd938209-46da-4f33-8496-23beb193ac96" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.014950 4707 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovn-acl-logging/0.log" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.016874 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovn-controller/0.log" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.017933 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.037666 4707 scope.go:117] "RemoveContainer" containerID="8dd2fbdf8bccce6a7f488354debd2bcd7b189cf8f8d866dfe9af0c453a8e6965" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.105780 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x7n4x"] Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106117 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="northd" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106139 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="northd" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106154 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovn-acl-logging" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106164 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovn-acl-logging" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106175 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovn-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106183 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovn-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106234 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="sbdb" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106245 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="sbdb" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106259 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kubecfg-setup" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106266 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kubecfg-setup" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106275 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kube-rbac-proxy-node" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106282 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kube-rbac-proxy-node" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106311 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106319 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106333 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="nbdb" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106342 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="nbdb" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106354 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106360 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106413 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106423 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106444 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kube-rbac-proxy-ovn-metrics" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106452 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kube-rbac-proxy-ovn-metrics" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106604 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="northd" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106618 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106628 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="sbdb" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106638 4707 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kube-rbac-proxy-ovn-metrics" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106653 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="kube-rbac-proxy-node" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106665 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106675 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106684 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="nbdb" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106694 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovn-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106702 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106711 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovn-acl-logging" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106834 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106845 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: E0129 03:38:47.106857 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106865 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.106996 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" containerName="ovnkube-controller" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.109254 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136381 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-script-lib\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-var-lib-openvswitch\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-openvswitch\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") " Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-env-overrides\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-etc-openvswitch\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-systemd\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-netns\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-kubelet\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-log-socket\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-slash\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136792 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-node-log\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovn-node-metrics-cert\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136970 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-systemd-units\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.136977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137027 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxg4m\" (UniqueName: \"kubernetes.io/projected/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-kube-api-access-wxg4m\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-config\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137293 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-ovn-kubernetes\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-ovn\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-netd\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137408 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-bin\") pod \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\" (UID: \"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a\") "
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137852 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137410 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137590 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137636 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-log-socket" (OuterVolumeSpecName: "log-socket") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-slash" (OuterVolumeSpecName: "host-slash") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.137713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-node-log" (OuterVolumeSpecName: "node-log") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.138104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.138148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.138619 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.138646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.139336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.139366 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.147273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-kube-api-access-wxg4m" (OuterVolumeSpecName: "kube-api-access-wxg4m") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "kube-api-access-wxg4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.149097 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.169112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" (UID: "f3eccef7-1d8e-42b5-b7c8-2cd378b7465a"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.239236 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-log-socket\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.239292 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.239316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-systemd-units\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.239335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-slash\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.239475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-cni-netd\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.239711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6kl\" (UniqueName: \"kubernetes.io/projected/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-kube-api-access-mj6kl\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.239860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-var-lib-openvswitch\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.239904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-cni-bin\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.239993 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-node-log\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-etc-openvswitch\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-run-ovn\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-ovnkube-config\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-ovnkube-script-lib\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-run-systemd\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240314 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-run-openvswitch\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-env-overrides\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-kubelet\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-run-netns\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-ovn-node-metrics-cert\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240725 4707 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240755 4707 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240768 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240785 4707 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240797 4707 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-netns\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240810 4707 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-systemd\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240822 4707 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-kubelet\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240837 4707 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-slash\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240849 4707 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-log-socket\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240861 4707 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-node-log\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240875 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240890 4707 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240907 4707 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-systemd-units\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240923 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxg4m\" (UniqueName: \"kubernetes.io/projected/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-kube-api-access-wxg4m\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240936 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240950 4707 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240963 4707 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240974 4707 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-netd\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.240989 4707 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a-host-cni-bin\") on node \"crc\" DevicePath \"\""
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-var-lib-openvswitch\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-cni-bin\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-node-log\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-etc-openvswitch\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-var-lib-openvswitch\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-run-ovn\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-cni-bin\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-node-log\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-etc-openvswitch\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-run-ovn\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-ovnkube-config\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-ovnkube-script-lib\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-run-systemd\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-run-openvswitch\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-env-overrides\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.342978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-kubelet\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-run-netns\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-ovn-node-metrics-cert\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343145 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-log-socket\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-systemd-units\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-slash\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-run-netns\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343705 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-kubelet\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-run-openvswitch\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-ovnkube-script-lib\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.344094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-systemd-units\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.344130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-slash\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.343540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-cni-netd\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.344159 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-log-socket\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x"
Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.344155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName:
\"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-cni-netd\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.344294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-host-run-ovn-kubernetes\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.344305 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6kl\" (UniqueName: \"kubernetes.io/projected/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-kube-api-access-mj6kl\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.344411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-ovnkube-config\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.344428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-run-systemd\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.344455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-env-overrides\") pod 
\"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.347509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-ovn-node-metrics-cert\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.372115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6kl\" (UniqueName: \"kubernetes.io/projected/a37a4cfa-df5d-4086-9de6-5d8bcec0772a-kube-api-access-mj6kl\") pod \"ovnkube-node-x7n4x\" (UID: \"a37a4cfa-df5d-4086-9de6-5d8bcec0772a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:47 crc kubenswrapper[4707]: I0129 03:38:47.426328 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.025225 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/2.log" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.028068 4707 generic.go:334] "Generic (PLEG): container finished" podID="a37a4cfa-df5d-4086-9de6-5d8bcec0772a" containerID="217ec293e5f0855a3d873c62e4ad0e4e1224cdc5e3e4ba431c1cc69c65631b83" exitCode=0 Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.028172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerDied","Data":"217ec293e5f0855a3d873c62e4ad0e4e1224cdc5e3e4ba431c1cc69c65631b83"} Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.028241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerStarted","Data":"1d79f893b11ae889d7be22cd7ed59fcdbce5494a71e2acc6904d005bc5fbb178"} Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.035564 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovn-acl-logging/0.log" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.036391 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nn7fm_f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/ovn-controller/0.log" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.037065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" event={"ID":"f3eccef7-1d8e-42b5-b7c8-2cd378b7465a","Type":"ContainerDied","Data":"6669b176e855d9fbffff2fdbd7431d03f35e4b019d06e6672dcb1aad7085471b"} Jan 29 03:38:48 crc kubenswrapper[4707]: 
I0129 03:38:48.037124 4707 scope.go:117] "RemoveContainer" containerID="e6ba46a3b002e4a00eceb67f566d451ea3eff8379adb7ccdc4ae2f8298abd464" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.039805 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nn7fm" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.073183 4707 scope.go:117] "RemoveContainer" containerID="fc8b384a386d3e5d501bea4b9b8ac903bd2c07dc2e1ddb7250148bdc57631c89" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.125877 4707 scope.go:117] "RemoveContainer" containerID="d93822ea3d68f822c95efdfc00e711f0d453e639d6ce82f52c91b32a399d8394" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.149404 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nn7fm"] Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.157067 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nn7fm"] Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.160074 4707 scope.go:117] "RemoveContainer" containerID="4b77b0b73b61311ef46b70fdd5d9f92dcf0f00c409f5b606b15cfa2a62ae5a63" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.183939 4707 scope.go:117] "RemoveContainer" containerID="fd49e3f146a06e82e4335112da21717027df21ce7ccea68cdc96240dc8ec28c4" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.203071 4707 scope.go:117] "RemoveContainer" containerID="058a76c311aad55d58913f7d2bd1968d5c904ca4c4216d5623d9135edf39c943" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.220552 4707 scope.go:117] "RemoveContainer" containerID="6122d3cbbac8b9305af3e956f4b9b3304ed739bae3fc5ce9651d10a03900ea55" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.240319 4707 scope.go:117] "RemoveContainer" containerID="e00264ac843fecb6248111f523b29f121f8d28c77ee84e3f990477be5c9f2125" Jan 29 03:38:48 crc kubenswrapper[4707]: I0129 03:38:48.259333 4707 
scope.go:117] "RemoveContainer" containerID="3809b7892f47c4c8760446721ea73e9aed5b65d76eb6702d9a53a58c04569463" Jan 29 03:38:49 crc kubenswrapper[4707]: I0129 03:38:49.054978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerStarted","Data":"305b81dd79bc52375c845ae0b59911f63baa50c7c7f6f1702a46ae86d95d31f0"} Jan 29 03:38:49 crc kubenswrapper[4707]: I0129 03:38:49.055040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerStarted","Data":"c4ba1ad817fd06ff35194405f2ea2b5aa50d84c59b98c6ce8c3243fae42b0551"} Jan 29 03:38:49 crc kubenswrapper[4707]: I0129 03:38:49.055053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerStarted","Data":"141ac65dced27037f2b80cc7e3b918960b7df0f05549f49fcbe295670b5c9a50"} Jan 29 03:38:49 crc kubenswrapper[4707]: I0129 03:38:49.055067 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerStarted","Data":"3bbdd788db96087dfedbc34ed731083e8f6f8715e5b8cac92b96be1564546b72"} Jan 29 03:38:49 crc kubenswrapper[4707]: I0129 03:38:49.055077 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerStarted","Data":"aee1396debd96df34aa90f705fccb1eaae0ae60710b91988de754d5effc642a0"} Jan 29 03:38:49 crc kubenswrapper[4707]: I0129 03:38:49.055090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" 
event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerStarted","Data":"cbb5e1f29e928e363cfebbde2961f25d3442332683aca16436d8a602bcf3c206"} Jan 29 03:38:49 crc kubenswrapper[4707]: I0129 03:38:49.255381 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3eccef7-1d8e-42b5-b7c8-2cd378b7465a" path="/var/lib/kubelet/pods/f3eccef7-1d8e-42b5-b7c8-2cd378b7465a/volumes" Jan 29 03:38:52 crc kubenswrapper[4707]: I0129 03:38:52.087411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerStarted","Data":"4bed204066af296c3bfdba5607167d13a0a9a56d6d2c7863c8dc318073863af2"} Jan 29 03:38:54 crc kubenswrapper[4707]: I0129 03:38:54.115375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" event={"ID":"a37a4cfa-df5d-4086-9de6-5d8bcec0772a","Type":"ContainerStarted","Data":"85629cac143f59a05959282a4b18771ba4a3b28531b514bf4d8eb9a563655077"} Jan 29 03:38:54 crc kubenswrapper[4707]: I0129 03:38:54.117611 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:54 crc kubenswrapper[4707]: I0129 03:38:54.117649 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:54 crc kubenswrapper[4707]: I0129 03:38:54.117706 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:54 crc kubenswrapper[4707]: I0129 03:38:54.212519 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:54 crc kubenswrapper[4707]: I0129 03:38:54.212891 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:38:54 crc 
kubenswrapper[4707]: I0129 03:38:54.246297 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" podStartSLOduration=7.246278301 podStartE2EDuration="7.246278301s" podCreationTimestamp="2026-01-29 03:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:38:54.172403023 +0000 UTC m=+687.656631928" watchObservedRunningTime="2026-01-29 03:38:54.246278301 +0000 UTC m=+687.730507206" Jan 29 03:39:02 crc kubenswrapper[4707]: I0129 03:39:02.244136 4707 scope.go:117] "RemoveContainer" containerID="5655bbc11ea24b509e78271170c1b3b66ff0b6788c59aa6680676258a96736b3" Jan 29 03:39:02 crc kubenswrapper[4707]: E0129 03:39:02.245169 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vh9xt_openshift-multus(bd938209-46da-4f33-8496-23beb193ac96)\"" pod="openshift-multus/multus-vh9xt" podUID="bd938209-46da-4f33-8496-23beb193ac96" Jan 29 03:39:03 crc kubenswrapper[4707]: I0129 03:39:03.463081 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:39:03 crc kubenswrapper[4707]: I0129 03:39:03.463170 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:39:13 crc kubenswrapper[4707]: I0129 03:39:13.243865 4707 scope.go:117] "RemoveContainer" 
containerID="5655bbc11ea24b509e78271170c1b3b66ff0b6788c59aa6680676258a96736b3" Jan 29 03:39:14 crc kubenswrapper[4707]: I0129 03:39:14.265291 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vh9xt_bd938209-46da-4f33-8496-23beb193ac96/kube-multus/2.log" Jan 29 03:39:14 crc kubenswrapper[4707]: I0129 03:39:14.265819 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vh9xt" event={"ID":"bd938209-46da-4f33-8496-23beb193ac96","Type":"ContainerStarted","Data":"407942d36767e9d88a3a5ba147c264eb2ca0437b955ed486f25bf695da90a4b9"} Jan 29 03:39:17 crc kubenswrapper[4707]: I0129 03:39:17.464168 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x7n4x" Jan 29 03:39:21 crc kubenswrapper[4707]: I0129 03:39:21.987633 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn"] Jan 29 03:39:21 crc kubenswrapper[4707]: I0129 03:39:21.989164 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:21 crc kubenswrapper[4707]: I0129 03:39:21.993207 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.000558 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn"] Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.114656 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8pm2\" (UniqueName: \"kubernetes.io/projected/326126f2-a0ee-40f0-9bf9-82dc8f430539-kube-api-access-s8pm2\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.114782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.114848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: 
I0129 03:39:22.216480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8pm2\" (UniqueName: \"kubernetes.io/projected/326126f2-a0ee-40f0-9bf9-82dc8f430539-kube-api-access-s8pm2\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.216581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.216616 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.217199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.217487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.236375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8pm2\" (UniqueName: \"kubernetes.io/projected/326126f2-a0ee-40f0-9bf9-82dc8f430539-kube-api-access-s8pm2\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.315232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:22 crc kubenswrapper[4707]: I0129 03:39:22.493991 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn"] Jan 29 03:39:23 crc kubenswrapper[4707]: I0129 03:39:23.324368 4707 generic.go:334] "Generic (PLEG): container finished" podID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerID="f5233eff5e00130ec22c2920e37c5b2c714c7e36b7a564ca87e39f4e090ab472" exitCode=0 Jan 29 03:39:23 crc kubenswrapper[4707]: I0129 03:39:23.324438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" event={"ID":"326126f2-a0ee-40f0-9bf9-82dc8f430539","Type":"ContainerDied","Data":"f5233eff5e00130ec22c2920e37c5b2c714c7e36b7a564ca87e39f4e090ab472"} Jan 29 03:39:23 crc kubenswrapper[4707]: I0129 03:39:23.324476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" event={"ID":"326126f2-a0ee-40f0-9bf9-82dc8f430539","Type":"ContainerStarted","Data":"04c7b2dfb4f2acc7d5e1f66486bbeea1dde7b05f520f69a2de0b674d09844170"} Jan 29 03:39:25 crc kubenswrapper[4707]: I0129 03:39:25.342043 4707 generic.go:334] "Generic (PLEG): container finished" podID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerID="6fed3964b89dbabb269080ae3ba070514b522db44929c01a8b2f5114bc88cdda" exitCode=0 Jan 29 03:39:25 crc kubenswrapper[4707]: I0129 03:39:25.342146 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" event={"ID":"326126f2-a0ee-40f0-9bf9-82dc8f430539","Type":"ContainerDied","Data":"6fed3964b89dbabb269080ae3ba070514b522db44929c01a8b2f5114bc88cdda"} Jan 29 03:39:26 crc kubenswrapper[4707]: I0129 03:39:26.352205 4707 generic.go:334] "Generic (PLEG): container finished" podID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerID="d5403790bdc3925a446242df5d1ff31a4e5571e491853263a4d95706fc5aa20c" exitCode=0 Jan 29 03:39:26 crc kubenswrapper[4707]: I0129 03:39:26.352262 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" event={"ID":"326126f2-a0ee-40f0-9bf9-82dc8f430539","Type":"ContainerDied","Data":"d5403790bdc3925a446242df5d1ff31a4e5571e491853263a4d95706fc5aa20c"} Jan 29 03:39:27 crc kubenswrapper[4707]: I0129 03:39:27.658307 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:27 crc kubenswrapper[4707]: I0129 03:39:27.827136 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-util\") pod \"326126f2-a0ee-40f0-9bf9-82dc8f430539\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " Jan 29 03:39:27 crc kubenswrapper[4707]: I0129 03:39:27.827987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8pm2\" (UniqueName: \"kubernetes.io/projected/326126f2-a0ee-40f0-9bf9-82dc8f430539-kube-api-access-s8pm2\") pod \"326126f2-a0ee-40f0-9bf9-82dc8f430539\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " Jan 29 03:39:27 crc kubenswrapper[4707]: I0129 03:39:27.828213 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-bundle\") pod \"326126f2-a0ee-40f0-9bf9-82dc8f430539\" (UID: \"326126f2-a0ee-40f0-9bf9-82dc8f430539\") " Jan 29 03:39:27 crc kubenswrapper[4707]: I0129 03:39:27.830075 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-bundle" (OuterVolumeSpecName: "bundle") pod "326126f2-a0ee-40f0-9bf9-82dc8f430539" (UID: "326126f2-a0ee-40f0-9bf9-82dc8f430539"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:39:27 crc kubenswrapper[4707]: I0129 03:39:27.837952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/326126f2-a0ee-40f0-9bf9-82dc8f430539-kube-api-access-s8pm2" (OuterVolumeSpecName: "kube-api-access-s8pm2") pod "326126f2-a0ee-40f0-9bf9-82dc8f430539" (UID: "326126f2-a0ee-40f0-9bf9-82dc8f430539"). InnerVolumeSpecName "kube-api-access-s8pm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:39:27 crc kubenswrapper[4707]: I0129 03:39:27.930449 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8pm2\" (UniqueName: \"kubernetes.io/projected/326126f2-a0ee-40f0-9bf9-82dc8f430539-kube-api-access-s8pm2\") on node \"crc\" DevicePath \"\"" Jan 29 03:39:27 crc kubenswrapper[4707]: I0129 03:39:27.930514 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:39:28 crc kubenswrapper[4707]: I0129 03:39:28.064918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-util" (OuterVolumeSpecName: "util") pod "326126f2-a0ee-40f0-9bf9-82dc8f430539" (UID: "326126f2-a0ee-40f0-9bf9-82dc8f430539"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:39:28 crc kubenswrapper[4707]: I0129 03:39:28.134728 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/326126f2-a0ee-40f0-9bf9-82dc8f430539-util\") on node \"crc\" DevicePath \"\"" Jan 29 03:39:28 crc kubenswrapper[4707]: I0129 03:39:28.371691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" event={"ID":"326126f2-a0ee-40f0-9bf9-82dc8f430539","Type":"ContainerDied","Data":"04c7b2dfb4f2acc7d5e1f66486bbeea1dde7b05f520f69a2de0b674d09844170"} Jan 29 03:39:28 crc kubenswrapper[4707]: I0129 03:39:28.371771 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c7b2dfb4f2acc7d5e1f66486bbeea1dde7b05f520f69a2de0b674d09844170" Jan 29 03:39:28 crc kubenswrapper[4707]: I0129 03:39:28.371830 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.626689 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-6497v"] Jan 29 03:39:29 crc kubenswrapper[4707]: E0129 03:39:29.627014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerName="util" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.627033 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerName="util" Jan 29 03:39:29 crc kubenswrapper[4707]: E0129 03:39:29.627060 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerName="extract" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.627068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerName="extract" Jan 29 03:39:29 crc kubenswrapper[4707]: E0129 03:39:29.627081 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerName="pull" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.627088 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerName="pull" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.627220 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="326126f2-a0ee-40f0-9bf9-82dc8f430539" containerName="extract" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.627808 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-6497v" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.629926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-5cwk5" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.630135 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.630310 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.643566 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-6497v"] Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.759247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gfs\" (UniqueName: \"kubernetes.io/projected/172d1247-a499-49cf-a003-3c70d059385f-kube-api-access-52gfs\") pod \"nmstate-operator-646758c888-6497v\" (UID: \"172d1247-a499-49cf-a003-3c70d059385f\") " pod="openshift-nmstate/nmstate-operator-646758c888-6497v" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.860365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52gfs\" (UniqueName: \"kubernetes.io/projected/172d1247-a499-49cf-a003-3c70d059385f-kube-api-access-52gfs\") pod \"nmstate-operator-646758c888-6497v\" (UID: \"172d1247-a499-49cf-a003-3c70d059385f\") " pod="openshift-nmstate/nmstate-operator-646758c888-6497v" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.879745 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gfs\" (UniqueName: \"kubernetes.io/projected/172d1247-a499-49cf-a003-3c70d059385f-kube-api-access-52gfs\") pod \"nmstate-operator-646758c888-6497v\" (UID: 
\"172d1247-a499-49cf-a003-3c70d059385f\") " pod="openshift-nmstate/nmstate-operator-646758c888-6497v" Jan 29 03:39:29 crc kubenswrapper[4707]: I0129 03:39:29.952402 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-6497v" Jan 29 03:39:30 crc kubenswrapper[4707]: I0129 03:39:30.269701 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-6497v"] Jan 29 03:39:30 crc kubenswrapper[4707]: I0129 03:39:30.384442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-6497v" event={"ID":"172d1247-a499-49cf-a003-3c70d059385f","Type":"ContainerStarted","Data":"f91abb45e4f8990711a8e74d4bf7b9626a044922d1170bb135ca7da74670f937"} Jan 29 03:39:33 crc kubenswrapper[4707]: I0129 03:39:33.406851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-6497v" event={"ID":"172d1247-a499-49cf-a003-3c70d059385f","Type":"ContainerStarted","Data":"b99cd0f43ad42d08d29957d1645199bf3b6faf082be550459d8f5cb9a303a442"} Jan 29 03:39:33 crc kubenswrapper[4707]: I0129 03:39:33.425827 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-6497v" podStartSLOduration=2.46869653 podStartE2EDuration="4.425803413s" podCreationTimestamp="2026-01-29 03:39:29 +0000 UTC" firstStartedPulling="2026-01-29 03:39:30.281415166 +0000 UTC m=+723.765644071" lastFinishedPulling="2026-01-29 03:39:32.238522049 +0000 UTC m=+725.722750954" observedRunningTime="2026-01-29 03:39:33.422012717 +0000 UTC m=+726.906241642" watchObservedRunningTime="2026-01-29 03:39:33.425803413 +0000 UTC m=+726.910032318" Jan 29 03:39:33 crc kubenswrapper[4707]: I0129 03:39:33.463726 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:39:33 crc kubenswrapper[4707]: I0129 03:39:33.463834 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.472505 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kdqjs"] Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.473953 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-kdqjs" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.476286 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-w2bnr" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.481644 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kdqjs"] Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.492481 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l"] Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.493559 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.495892 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.510327 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xvfr7"] Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.511418 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.513352 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l"] Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.527043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9pj\" (UniqueName: \"kubernetes.io/projected/35c9d599-ff9e-4713-8c0d-6e72c41f6859-kube-api-access-9h9pj\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.527168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/35c9d599-ff9e-4713-8c0d-6e72c41f6859-nmstate-lock\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.527287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/35c9d599-ff9e-4713-8c0d-6e72c41f6859-ovs-socket\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 
03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.527465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kgvm\" (UniqueName: \"kubernetes.io/projected/7eec8389-133d-412b-a2f6-813eaf6e6468-kube-api-access-8kgvm\") pod \"nmstate-webhook-8474b5b9d8-7rh4l\" (UID: \"7eec8389-133d-412b-a2f6-813eaf6e6468\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.527604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7eec8389-133d-412b-a2f6-813eaf6e6468-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7rh4l\" (UID: \"7eec8389-133d-412b-a2f6-813eaf6e6468\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.527720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/35c9d599-ff9e-4713-8c0d-6e72c41f6859-dbus-socket\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.527766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gvnr\" (UniqueName: \"kubernetes.io/projected/966ffde7-06ec-4066-b9db-b4b1e750095f-kube-api-access-2gvnr\") pod \"nmstate-metrics-54757c584b-kdqjs\" (UID: \"966ffde7-06ec-4066-b9db-b4b1e750095f\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kdqjs" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.628683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9pj\" (UniqueName: \"kubernetes.io/projected/35c9d599-ff9e-4713-8c0d-6e72c41f6859-kube-api-access-9h9pj\") pod \"nmstate-handler-xvfr7\" (UID: 
\"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.628752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/35c9d599-ff9e-4713-8c0d-6e72c41f6859-nmstate-lock\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.628781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/35c9d599-ff9e-4713-8c0d-6e72c41f6859-ovs-socket\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.628810 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kgvm\" (UniqueName: \"kubernetes.io/projected/7eec8389-133d-412b-a2f6-813eaf6e6468-kube-api-access-8kgvm\") pod \"nmstate-webhook-8474b5b9d8-7rh4l\" (UID: \"7eec8389-133d-412b-a2f6-813eaf6e6468\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.628832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7eec8389-133d-412b-a2f6-813eaf6e6468-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7rh4l\" (UID: \"7eec8389-133d-412b-a2f6-813eaf6e6468\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.628859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/35c9d599-ff9e-4713-8c0d-6e72c41f6859-dbus-socket\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " 
pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.628883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gvnr\" (UniqueName: \"kubernetes.io/projected/966ffde7-06ec-4066-b9db-b4b1e750095f-kube-api-access-2gvnr\") pod \"nmstate-metrics-54757c584b-kdqjs\" (UID: \"966ffde7-06ec-4066-b9db-b4b1e750095f\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kdqjs" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.628934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/35c9d599-ff9e-4713-8c0d-6e72c41f6859-nmstate-lock\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.628961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/35c9d599-ff9e-4713-8c0d-6e72c41f6859-ovs-socket\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: E0129 03:39:34.629115 4707 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 29 03:39:34 crc kubenswrapper[4707]: E0129 03:39:34.629243 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eec8389-133d-412b-a2f6-813eaf6e6468-tls-key-pair podName:7eec8389-133d-412b-a2f6-813eaf6e6468 nodeName:}" failed. No retries permitted until 2026-01-29 03:39:35.129212538 +0000 UTC m=+728.613441443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/7eec8389-133d-412b-a2f6-813eaf6e6468-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-7rh4l" (UID: "7eec8389-133d-412b-a2f6-813eaf6e6468") : secret "openshift-nmstate-webhook" not found Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.629443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/35c9d599-ff9e-4713-8c0d-6e72c41f6859-dbus-socket\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.651885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9pj\" (UniqueName: \"kubernetes.io/projected/35c9d599-ff9e-4713-8c0d-6e72c41f6859-kube-api-access-9h9pj\") pod \"nmstate-handler-xvfr7\" (UID: \"35c9d599-ff9e-4713-8c0d-6e72c41f6859\") " pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.651977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gvnr\" (UniqueName: \"kubernetes.io/projected/966ffde7-06ec-4066-b9db-b4b1e750095f-kube-api-access-2gvnr\") pod \"nmstate-metrics-54757c584b-kdqjs\" (UID: \"966ffde7-06ec-4066-b9db-b4b1e750095f\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kdqjs" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.666320 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz"] Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.667219 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.684731 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-pzcjc" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.685755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kgvm\" (UniqueName: \"kubernetes.io/projected/7eec8389-133d-412b-a2f6-813eaf6e6468-kube-api-access-8kgvm\") pod \"nmstate-webhook-8474b5b9d8-7rh4l\" (UID: \"7eec8389-133d-412b-a2f6-813eaf6e6468\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.686119 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.686197 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.713310 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz"] Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.791998 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-kdqjs" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.832269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4657ed73-851c-43e4-9f85-e06471c81722-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: \"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.832390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4657ed73-851c-43e4-9f85-e06471c81722-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: \"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.832430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgptt\" (UniqueName: \"kubernetes.io/projected/4657ed73-851c-43e4-9f85-e06471c81722-kube-api-access-kgptt\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: \"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.872999 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:34 crc kubenswrapper[4707]: W0129 03:39:34.922158 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c9d599_ff9e_4713_8c0d_6e72c41f6859.slice/crio-cd9845c694f87e4dc47318bb7a78b95f12e8bbe263705653b97495b3d5978049 WatchSource:0}: Error finding container cd9845c694f87e4dc47318bb7a78b95f12e8bbe263705653b97495b3d5978049: Status 404 returned error can't find the container with id cd9845c694f87e4dc47318bb7a78b95f12e8bbe263705653b97495b3d5978049 Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.933477 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4657ed73-851c-43e4-9f85-e06471c81722-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: \"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.933873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4657ed73-851c-43e4-9f85-e06471c81722-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: \"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.933901 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgptt\" (UniqueName: \"kubernetes.io/projected/4657ed73-851c-43e4-9f85-e06471c81722-kube-api-access-kgptt\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: \"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.935105 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4657ed73-851c-43e4-9f85-e06471c81722-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: \"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:34 crc kubenswrapper[4707]: E0129 03:39:34.935181 4707 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Jan 29 03:39:34 crc kubenswrapper[4707]: E0129 03:39:34.935225 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4657ed73-851c-43e4-9f85-e06471c81722-plugin-serving-cert podName:4657ed73-851c-43e4-9f85-e06471c81722 nodeName:}" failed. No retries permitted until 2026-01-29 03:39:35.435212416 +0000 UTC m=+728.919441321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/4657ed73-851c-43e4-9f85-e06471c81722-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-rkzwz" (UID: "4657ed73-851c-43e4-9f85-e06471c81722") : secret "plugin-serving-cert" not found Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.958909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgptt\" (UniqueName: \"kubernetes.io/projected/4657ed73-851c-43e4-9f85-e06471c81722-kube-api-access-kgptt\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: \"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.980466 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7485dd7847-z2mfn"] Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.981605 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:34 crc kubenswrapper[4707]: I0129 03:39:34.994359 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7485dd7847-z2mfn"] Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.077883 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kdqjs"] Jan 29 03:39:35 crc kubenswrapper[4707]: W0129 03:39:35.089994 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod966ffde7_06ec_4066_b9db_b4b1e750095f.slice/crio-6bf24d87ac61cca437ea17be02c07d7e4919520bcb8493104ebacf4e0c2f49a0 WatchSource:0}: Error finding container 6bf24d87ac61cca437ea17be02c07d7e4919520bcb8493104ebacf4e0c2f49a0: Status 404 returned error can't find the container with id 6bf24d87ac61cca437ea17be02c07d7e4919520bcb8493104ebacf4e0c2f49a0 Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.136694 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-service-ca\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.136748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-console-serving-cert\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.136778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2gzl\" (UniqueName: 
\"kubernetes.io/projected/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-kube-api-access-c2gzl\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.136829 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7eec8389-133d-412b-a2f6-813eaf6e6468-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7rh4l\" (UID: \"7eec8389-133d-412b-a2f6-813eaf6e6468\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.136852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-console-oauth-config\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.136876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-trusted-ca-bundle\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.136923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-oauth-serving-cert\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.137008 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-console-config\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.141414 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7eec8389-133d-412b-a2f6-813eaf6e6468-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7rh4l\" (UID: \"7eec8389-133d-412b-a2f6-813eaf6e6468\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.238262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-oauth-serving-cert\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.238357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-console-config\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.238415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-service-ca\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.238450 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-console-serving-cert\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.238468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gzl\" (UniqueName: \"kubernetes.io/projected/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-kube-api-access-c2gzl\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.238499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-console-oauth-config\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.238553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-trusted-ca-bundle\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.239575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-service-ca\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.240271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-trusted-ca-bundle\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.240604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-oauth-serving-cert\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.240648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-console-config\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.243741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-console-serving-cert\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.246804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-console-oauth-config\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.257396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gzl\" (UniqueName: 
\"kubernetes.io/projected/447f9f32-ce31-42f6-9754-c5dfaa5e9b9d-kube-api-access-c2gzl\") pod \"console-7485dd7847-z2mfn\" (UID: \"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d\") " pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.306773 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.407321 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.436069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xvfr7" event={"ID":"35c9d599-ff9e-4713-8c0d-6e72c41f6859","Type":"ContainerStarted","Data":"cd9845c694f87e4dc47318bb7a78b95f12e8bbe263705653b97495b3d5978049"} Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.438364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kdqjs" event={"ID":"966ffde7-06ec-4066-b9db-b4b1e750095f","Type":"ContainerStarted","Data":"6bf24d87ac61cca437ea17be02c07d7e4919520bcb8493104ebacf4e0c2f49a0"} Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.441414 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4657ed73-851c-43e4-9f85-e06471c81722-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: \"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.450433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4657ed73-851c-43e4-9f85-e06471c81722-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-rkzwz\" (UID: 
\"4657ed73-851c-43e4-9f85-e06471c81722\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.568418 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7485dd7847-z2mfn"] Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.643352 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.737191 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l"] Jan 29 03:39:35 crc kubenswrapper[4707]: I0129 03:39:35.893550 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz"] Jan 29 03:39:35 crc kubenswrapper[4707]: W0129 03:39:35.899344 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4657ed73_851c_43e4_9f85_e06471c81722.slice/crio-e0276ebf214c640c6332743da9712da357a7c4c17ff497485d22c80b9ee253cc WatchSource:0}: Error finding container e0276ebf214c640c6332743da9712da357a7c4c17ff497485d22c80b9ee253cc: Status 404 returned error can't find the container with id e0276ebf214c640c6332743da9712da357a7c4c17ff497485d22c80b9ee253cc Jan 29 03:39:36 crc kubenswrapper[4707]: I0129 03:39:36.447631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" event={"ID":"7eec8389-133d-412b-a2f6-813eaf6e6468","Type":"ContainerStarted","Data":"1e5bc52b12ca9db3aae7bb1d82067c818166d9d734aaa0bad40cdc0766b375b9"} Jan 29 03:39:36 crc kubenswrapper[4707]: I0129 03:39:36.450106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" 
event={"ID":"4657ed73-851c-43e4-9f85-e06471c81722","Type":"ContainerStarted","Data":"e0276ebf214c640c6332743da9712da357a7c4c17ff497485d22c80b9ee253cc"} Jan 29 03:39:36 crc kubenswrapper[4707]: I0129 03:39:36.452045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7485dd7847-z2mfn" event={"ID":"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d","Type":"ContainerStarted","Data":"5fdf3204610738f6b291dbde012a6226d4eec5a377a8fbf6afa6d70fa91b7ca3"} Jan 29 03:39:36 crc kubenswrapper[4707]: I0129 03:39:36.452081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7485dd7847-z2mfn" event={"ID":"447f9f32-ce31-42f6-9754-c5dfaa5e9b9d","Type":"ContainerStarted","Data":"ddbdc5950b7a3ac6a4f6ffa80b97c76c6fd485a7ba03f7a369c3eea4d033123f"} Jan 29 03:39:36 crc kubenswrapper[4707]: I0129 03:39:36.479394 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7485dd7847-z2mfn" podStartSLOduration=2.479362411 podStartE2EDuration="2.479362411s" podCreationTimestamp="2026-01-29 03:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:39:36.475090118 +0000 UTC m=+729.959319023" watchObservedRunningTime="2026-01-29 03:39:36.479362411 +0000 UTC m=+729.963591306" Jan 29 03:39:38 crc kubenswrapper[4707]: I0129 03:39:38.468960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" event={"ID":"7eec8389-133d-412b-a2f6-813eaf6e6468","Type":"ContainerStarted","Data":"5da06bf72fc221ab345b6920a2f625b4755fb370ce05f90de353cccda5f0c157"} Jan 29 03:39:38 crc kubenswrapper[4707]: I0129 03:39:38.470759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:39:38 crc kubenswrapper[4707]: I0129 03:39:38.470788 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:38 crc kubenswrapper[4707]: I0129 03:39:38.470802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xvfr7" event={"ID":"35c9d599-ff9e-4713-8c0d-6e72c41f6859","Type":"ContainerStarted","Data":"d5edff1abf6b749b74bbd5c12421233568358fe4e5c42f9e91c8b109c576003a"} Jan 29 03:39:38 crc kubenswrapper[4707]: I0129 03:39:38.475356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kdqjs" event={"ID":"966ffde7-06ec-4066-b9db-b4b1e750095f","Type":"ContainerStarted","Data":"be7e5abf5ff61f21bfc22ce84de27fb8effbca23080b06540f96a0ff29d37401"} Jan 29 03:39:38 crc kubenswrapper[4707]: I0129 03:39:38.487463 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" podStartSLOduration=2.529601756 podStartE2EDuration="4.487411473s" podCreationTimestamp="2026-01-29 03:39:34 +0000 UTC" firstStartedPulling="2026-01-29 03:39:35.76759649 +0000 UTC m=+729.251825395" lastFinishedPulling="2026-01-29 03:39:37.725406207 +0000 UTC m=+731.209635112" observedRunningTime="2026-01-29 03:39:38.486028412 +0000 UTC m=+731.970257337" watchObservedRunningTime="2026-01-29 03:39:38.487411473 +0000 UTC m=+731.971640378" Jan 29 03:39:38 crc kubenswrapper[4707]: I0129 03:39:38.513992 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-xvfr7" podStartSLOduration=1.702686688 podStartE2EDuration="4.513963806s" podCreationTimestamp="2026-01-29 03:39:34 +0000 UTC" firstStartedPulling="2026-01-29 03:39:34.928809107 +0000 UTC m=+728.413038012" lastFinishedPulling="2026-01-29 03:39:37.740086205 +0000 UTC m=+731.224315130" observedRunningTime="2026-01-29 03:39:38.506415384 +0000 UTC m=+731.990644299" watchObservedRunningTime="2026-01-29 03:39:38.513963806 +0000 UTC m=+731.998192731" Jan 29 03:39:39 crc kubenswrapper[4707]: 
I0129 03:39:39.484210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" event={"ID":"4657ed73-851c-43e4-9f85-e06471c81722","Type":"ContainerStarted","Data":"c8cd4191da0d43dab298237081badcc5a0399124c9d18a8012d765b3e7fc0578"} Jan 29 03:39:39 crc kubenswrapper[4707]: I0129 03:39:39.507803 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-rkzwz" podStartSLOduration=2.681599172 podStartE2EDuration="5.507756677s" podCreationTimestamp="2026-01-29 03:39:34 +0000 UTC" firstStartedPulling="2026-01-29 03:39:35.902828251 +0000 UTC m=+729.387057156" lastFinishedPulling="2026-01-29 03:39:38.728985756 +0000 UTC m=+732.213214661" observedRunningTime="2026-01-29 03:39:39.502398291 +0000 UTC m=+732.986627196" watchObservedRunningTime="2026-01-29 03:39:39.507756677 +0000 UTC m=+732.991985572" Jan 29 03:39:40 crc kubenswrapper[4707]: I0129 03:39:40.495317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kdqjs" event={"ID":"966ffde7-06ec-4066-b9db-b4b1e750095f","Type":"ContainerStarted","Data":"a786c9a4713db81fa2c4e1234e0cb50bdfb7857193170eec82d3be5aa16a99be"} Jan 29 03:39:40 crc kubenswrapper[4707]: I0129 03:39:40.522039 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-kdqjs" podStartSLOduration=1.611022852 podStartE2EDuration="6.522008858s" podCreationTimestamp="2026-01-29 03:39:34 +0000 UTC" firstStartedPulling="2026-01-29 03:39:35.095607249 +0000 UTC m=+728.579836154" lastFinishedPulling="2026-01-29 03:39:40.006593255 +0000 UTC m=+733.490822160" observedRunningTime="2026-01-29 03:39:40.516160275 +0000 UTC m=+734.000389240" watchObservedRunningTime="2026-01-29 03:39:40.522008858 +0000 UTC m=+734.006237763" Jan 29 03:39:44 crc kubenswrapper[4707]: I0129 03:39:44.902940 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xvfr7" Jan 29 03:39:45 crc kubenswrapper[4707]: I0129 03:39:45.307231 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:45 crc kubenswrapper[4707]: I0129 03:39:45.307288 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:45 crc kubenswrapper[4707]: I0129 03:39:45.312321 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:45 crc kubenswrapper[4707]: I0129 03:39:45.531861 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7485dd7847-z2mfn" Jan 29 03:39:45 crc kubenswrapper[4707]: I0129 03:39:45.580906 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hp957"] Jan 29 03:39:55 crc kubenswrapper[4707]: I0129 03:39:55.413933 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7rh4l" Jan 29 03:40:03 crc kubenswrapper[4707]: I0129 03:40:03.463310 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:40:03 crc kubenswrapper[4707]: I0129 03:40:03.464435 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:40:03 crc kubenswrapper[4707]: I0129 03:40:03.464515 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:40:03 crc kubenswrapper[4707]: I0129 03:40:03.465632 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fdbfa93f1cdcabdbaa105826dff7f462fdab9740abd38ddc05a6f8c6801cf011"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 03:40:03 crc kubenswrapper[4707]: I0129 03:40:03.465775 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://fdbfa93f1cdcabdbaa105826dff7f462fdab9740abd38ddc05a6f8c6801cf011" gracePeriod=600 Jan 29 03:40:03 crc kubenswrapper[4707]: I0129 03:40:03.670324 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="fdbfa93f1cdcabdbaa105826dff7f462fdab9740abd38ddc05a6f8c6801cf011" exitCode=0 Jan 29 03:40:03 crc kubenswrapper[4707]: I0129 03:40:03.670626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"fdbfa93f1cdcabdbaa105826dff7f462fdab9740abd38ddc05a6f8c6801cf011"} Jan 29 03:40:03 crc kubenswrapper[4707]: I0129 03:40:03.671037 4707 scope.go:117] "RemoveContainer" containerID="7d626ab43efa55857c89f32b874520ad65ab395b5a2359de01e47becbb927c08" Jan 29 03:40:04 crc kubenswrapper[4707]: I0129 03:40:04.684140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" 
event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"db143d5776abf0f9e4af062dd3ffe22d1bdacd65eb8ea86d4728ea8e0ca0f327"} Jan 29 03:40:10 crc kubenswrapper[4707]: I0129 03:40:10.631194 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hp957" podUID="53bfd3ca-7447-44cf-af4c-165db1f5e7be" containerName="console" containerID="cri-o://22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc" gracePeriod=15 Jan 29 03:40:10 crc kubenswrapper[4707]: I0129 03:40:10.823252 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs"] Jan 29 03:40:10 crc kubenswrapper[4707]: I0129 03:40:10.824405 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:10 crc kubenswrapper[4707]: I0129 03:40:10.826753 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 03:40:10 crc kubenswrapper[4707]: I0129 03:40:10.862046 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs"] Jan 29 03:40:10 crc kubenswrapper[4707]: I0129 03:40:10.943169 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:10 crc kubenswrapper[4707]: I0129 03:40:10.943255 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6v68\" 
(UniqueName: \"kubernetes.io/projected/814f6017-596c-4a1b-87c6-78a2d013cec2-kube-api-access-g6v68\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:10 crc kubenswrapper[4707]: I0129 03:40:10.943287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.035937 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hp957_53bfd3ca-7447-44cf-af4c-165db1f5e7be/console/0.log" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.036297 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.045022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.045083 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6v68\" (UniqueName: \"kubernetes.io/projected/814f6017-596c-4a1b-87c6-78a2d013cec2-kube-api-access-g6v68\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.045111 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.045653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.045746 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.068446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6v68\" (UniqueName: \"kubernetes.io/projected/814f6017-596c-4a1b-87c6-78a2d013cec2-kube-api-access-g6v68\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.143299 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.146320 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-config\") pod \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.146396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsfwg\" (UniqueName: \"kubernetes.io/projected/53bfd3ca-7447-44cf-af4c-165db1f5e7be-kube-api-access-wsfwg\") pod \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.146504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-serving-cert\") pod \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.146577 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-oauth-config\") pod \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.146619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-service-ca\") pod \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.146643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-trusted-ca-bundle\") pod \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.146673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-oauth-serving-cert\") pod \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\" (UID: \"53bfd3ca-7447-44cf-af4c-165db1f5e7be\") " Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.147514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-config" (OuterVolumeSpecName: "console-config") pod "53bfd3ca-7447-44cf-af4c-165db1f5e7be" (UID: "53bfd3ca-7447-44cf-af4c-165db1f5e7be"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.147759 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-service-ca" (OuterVolumeSpecName: "service-ca") pod "53bfd3ca-7447-44cf-af4c-165db1f5e7be" (UID: "53bfd3ca-7447-44cf-af4c-165db1f5e7be"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.147775 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "53bfd3ca-7447-44cf-af4c-165db1f5e7be" (UID: "53bfd3ca-7447-44cf-af4c-165db1f5e7be"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.147882 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "53bfd3ca-7447-44cf-af4c-165db1f5e7be" (UID: "53bfd3ca-7447-44cf-af4c-165db1f5e7be"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.151249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "53bfd3ca-7447-44cf-af4c-165db1f5e7be" (UID: "53bfd3ca-7447-44cf-af4c-165db1f5e7be"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.151347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bfd3ca-7447-44cf-af4c-165db1f5e7be-kube-api-access-wsfwg" (OuterVolumeSpecName: "kube-api-access-wsfwg") pod "53bfd3ca-7447-44cf-af4c-165db1f5e7be" (UID: "53bfd3ca-7447-44cf-af4c-165db1f5e7be"). InnerVolumeSpecName "kube-api-access-wsfwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.151747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "53bfd3ca-7447-44cf-af4c-165db1f5e7be" (UID: "53bfd3ca-7447-44cf-af4c-165db1f5e7be"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.248285 4707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.248329 4707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.248339 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.248350 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.248364 4707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.248376 4707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53bfd3ca-7447-44cf-af4c-165db1f5e7be-console-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.248386 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsfwg\" (UniqueName: \"kubernetes.io/projected/53bfd3ca-7447-44cf-af4c-165db1f5e7be-kube-api-access-wsfwg\") on node \"crc\" DevicePath \"\"" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.547503 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs"] Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.733068 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hp957_53bfd3ca-7447-44cf-af4c-165db1f5e7be/console/0.log" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.733517 4707 generic.go:334] "Generic (PLEG): container finished" podID="53bfd3ca-7447-44cf-af4c-165db1f5e7be" containerID="22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc" exitCode=2 Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.733607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hp957" event={"ID":"53bfd3ca-7447-44cf-af4c-165db1f5e7be","Type":"ContainerDied","Data":"22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc"} Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 
03:40:11.733646 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hp957" event={"ID":"53bfd3ca-7447-44cf-af4c-165db1f5e7be","Type":"ContainerDied","Data":"1c662ee3a3d13614db59cd89e1ff86b9243dbc4c17240534528ff6d9614b3cd8"} Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.733670 4707 scope.go:117] "RemoveContainer" containerID="22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.733824 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hp957" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.738489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" event={"ID":"814f6017-596c-4a1b-87c6-78a2d013cec2","Type":"ContainerStarted","Data":"b9cf5f4346169d1472a4a2a46411fad29b8e82a446fe67c8b405582d5317367d"} Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.738533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" event={"ID":"814f6017-596c-4a1b-87c6-78a2d013cec2","Type":"ContainerStarted","Data":"f44b926ff509539e2e9d2a0702b47a999341a2f6a0206e80fb23b38d01ed17ab"} Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.784070 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hp957"] Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.789001 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hp957"] Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.790607 4707 scope.go:117] "RemoveContainer" containerID="22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc" Jan 29 03:40:11 crc kubenswrapper[4707]: E0129 03:40:11.791397 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc\": container with ID starting with 22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc not found: ID does not exist" containerID="22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc" Jan 29 03:40:11 crc kubenswrapper[4707]: I0129 03:40:11.791613 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc"} err="failed to get container status \"22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc\": rpc error: code = NotFound desc = could not find container \"22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc\": container with ID starting with 22e3617c4633ced4494d92c7ddd5f2ba9c54e00c7aa656b0bbfda2ac241874fc not found: ID does not exist" Jan 29 03:40:12 crc kubenswrapper[4707]: I0129 03:40:12.750012 4707 generic.go:334] "Generic (PLEG): container finished" podID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerID="b9cf5f4346169d1472a4a2a46411fad29b8e82a446fe67c8b405582d5317367d" exitCode=0 Jan 29 03:40:12 crc kubenswrapper[4707]: I0129 03:40:12.750073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" event={"ID":"814f6017-596c-4a1b-87c6-78a2d013cec2","Type":"ContainerDied","Data":"b9cf5f4346169d1472a4a2a46411fad29b8e82a446fe67c8b405582d5317367d"} Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.109719 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4xh56"] Jan 29 03:40:13 crc kubenswrapper[4707]: E0129 03:40:13.110035 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bfd3ca-7447-44cf-af4c-165db1f5e7be" containerName="console" Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.110050 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="53bfd3ca-7447-44cf-af4c-165db1f5e7be" containerName="console"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.110162 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bfd3ca-7447-44cf-af4c-165db1f5e7be" containerName="console"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.111094 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.126643 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xh56"]
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.177335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-catalog-content\") pod \"redhat-operators-4xh56\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") " pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.177750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7wc\" (UniqueName: \"kubernetes.io/projected/e5b8c571-1def-4784-8bc0-e52b484a6c1f-kube-api-access-kn7wc\") pod \"redhat-operators-4xh56\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") " pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.177997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-utilities\") pod \"redhat-operators-4xh56\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") " pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.251834 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53bfd3ca-7447-44cf-af4c-165db1f5e7be" path="/var/lib/kubelet/pods/53bfd3ca-7447-44cf-af4c-165db1f5e7be/volumes"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.279900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-catalog-content\") pod \"redhat-operators-4xh56\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") " pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.280010 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7wc\" (UniqueName: \"kubernetes.io/projected/e5b8c571-1def-4784-8bc0-e52b484a6c1f-kube-api-access-kn7wc\") pod \"redhat-operators-4xh56\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") " pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.280078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-utilities\") pod \"redhat-operators-4xh56\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") " pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.280490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-catalog-content\") pod \"redhat-operators-4xh56\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") " pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.280972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-utilities\") pod \"redhat-operators-4xh56\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") " pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.333723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7wc\" (UniqueName: \"kubernetes.io/projected/e5b8c571-1def-4784-8bc0-e52b484a6c1f-kube-api-access-kn7wc\") pod \"redhat-operators-4xh56\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") " pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.454046 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.674433 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xh56"]
Jan 29 03:40:13 crc kubenswrapper[4707]: I0129 03:40:13.759572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xh56" event={"ID":"e5b8c571-1def-4784-8bc0-e52b484a6c1f","Type":"ContainerStarted","Data":"a6bca5ae06d81a3581b3c1cf2b07c3dc58ef290ccb426b7527b618de179870dc"}
Jan 29 03:40:14 crc kubenswrapper[4707]: I0129 03:40:14.766968 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerID="e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631" exitCode=0
Jan 29 03:40:14 crc kubenswrapper[4707]: I0129 03:40:14.767073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xh56" event={"ID":"e5b8c571-1def-4784-8bc0-e52b484a6c1f","Type":"ContainerDied","Data":"e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631"}
Jan 29 03:40:14 crc kubenswrapper[4707]: I0129 03:40:14.770130 4707 generic.go:334] "Generic (PLEG): container finished" podID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerID="f28aca4d5c9d28d02382699e830266ee2bfb874da84b4096c6950bd3ce6b3ba7" exitCode=0
Jan 29 03:40:14 crc kubenswrapper[4707]: I0129 03:40:14.770388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" event={"ID":"814f6017-596c-4a1b-87c6-78a2d013cec2","Type":"ContainerDied","Data":"f28aca4d5c9d28d02382699e830266ee2bfb874da84b4096c6950bd3ce6b3ba7"}
Jan 29 03:40:15 crc kubenswrapper[4707]: I0129 03:40:15.778687 4707 generic.go:334] "Generic (PLEG): container finished" podID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerID="9a09364fe95a3f1261a746871a4cda7661bb83527bdf46aa76aecdd3f917aac0" exitCode=0
Jan 29 03:40:15 crc kubenswrapper[4707]: I0129 03:40:15.778786 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" event={"ID":"814f6017-596c-4a1b-87c6-78a2d013cec2","Type":"ContainerDied","Data":"9a09364fe95a3f1261a746871a4cda7661bb83527bdf46aa76aecdd3f917aac0"}
Jan 29 03:40:15 crc kubenswrapper[4707]: I0129 03:40:15.781481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xh56" event={"ID":"e5b8c571-1def-4784-8bc0-e52b484a6c1f","Type":"ContainerStarted","Data":"30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e"}
Jan 29 03:40:16 crc kubenswrapper[4707]: I0129 03:40:16.794592 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerID="30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e" exitCode=0
Jan 29 03:40:16 crc kubenswrapper[4707]: I0129 03:40:16.794696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xh56" event={"ID":"e5b8c571-1def-4784-8bc0-e52b484a6c1f","Type":"ContainerDied","Data":"30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e"}
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.089143 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs"
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.132643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6v68\" (UniqueName: \"kubernetes.io/projected/814f6017-596c-4a1b-87c6-78a2d013cec2-kube-api-access-g6v68\") pod \"814f6017-596c-4a1b-87c6-78a2d013cec2\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") "
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.132802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-util\") pod \"814f6017-596c-4a1b-87c6-78a2d013cec2\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") "
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.132830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-bundle\") pod \"814f6017-596c-4a1b-87c6-78a2d013cec2\" (UID: \"814f6017-596c-4a1b-87c6-78a2d013cec2\") "
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.134054 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-bundle" (OuterVolumeSpecName: "bundle") pod "814f6017-596c-4a1b-87c6-78a2d013cec2" (UID: "814f6017-596c-4a1b-87c6-78a2d013cec2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.146682 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814f6017-596c-4a1b-87c6-78a2d013cec2-kube-api-access-g6v68" (OuterVolumeSpecName: "kube-api-access-g6v68") pod "814f6017-596c-4a1b-87c6-78a2d013cec2" (UID: "814f6017-596c-4a1b-87c6-78a2d013cec2"). InnerVolumeSpecName "kube-api-access-g6v68". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.225019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-util" (OuterVolumeSpecName: "util") pod "814f6017-596c-4a1b-87c6-78a2d013cec2" (UID: "814f6017-596c-4a1b-87c6-78a2d013cec2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.234406 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-util\") on node \"crc\" DevicePath \"\""
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.234440 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/814f6017-596c-4a1b-87c6-78a2d013cec2-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.234456 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6v68\" (UniqueName: \"kubernetes.io/projected/814f6017-596c-4a1b-87c6-78a2d013cec2-kube-api-access-g6v68\") on node \"crc\" DevicePath \"\""
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.805458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs" event={"ID":"814f6017-596c-4a1b-87c6-78a2d013cec2","Type":"ContainerDied","Data":"f44b926ff509539e2e9d2a0702b47a999341a2f6a0206e80fb23b38d01ed17ab"}
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.805902 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f44b926ff509539e2e9d2a0702b47a999341a2f6a0206e80fb23b38d01ed17ab"
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.805492 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs"
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.808987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xh56" event={"ID":"e5b8c571-1def-4784-8bc0-e52b484a6c1f","Type":"ContainerStarted","Data":"b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299"}
Jan 29 03:40:17 crc kubenswrapper[4707]: I0129 03:40:17.831505 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4xh56" podStartSLOduration=2.349957376 podStartE2EDuration="4.831482581s" podCreationTimestamp="2026-01-29 03:40:13 +0000 UTC" firstStartedPulling="2026-01-29 03:40:14.768582968 +0000 UTC m=+768.252811873" lastFinishedPulling="2026-01-29 03:40:17.250108173 +0000 UTC m=+770.734337078" observedRunningTime="2026-01-29 03:40:17.829886066 +0000 UTC m=+771.314114971" watchObservedRunningTime="2026-01-29 03:40:17.831482581 +0000 UTC m=+771.315711486"
Jan 29 03:40:23 crc kubenswrapper[4707]: I0129 03:40:23.454466 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:23 crc kubenswrapper[4707]: I0129 03:40:23.456560 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:24 crc kubenswrapper[4707]: I0129 03:40:24.505320 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4xh56" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerName="registry-server" probeResult="failure" output=<
Jan 29 03:40:24 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s
Jan 29 03:40:24 crc kubenswrapper[4707]: >
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.911040 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"]
Jan 29 03:40:27 crc kubenswrapper[4707]: E0129 03:40:27.911718 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerName="extract"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.911733 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerName="extract"
Jan 29 03:40:27 crc kubenswrapper[4707]: E0129 03:40:27.911750 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerName="pull"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.911758 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerName="pull"
Jan 29 03:40:27 crc kubenswrapper[4707]: E0129 03:40:27.911768 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerName="util"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.911779 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerName="util"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.912078 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="814f6017-596c-4a1b-87c6-78a2d013cec2" containerName="extract"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.912699 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.917389 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.920035 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.920222 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-znw5j"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.920370 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.936163 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"]
Jan 29 03:40:27 crc kubenswrapper[4707]: I0129 03:40:27.939521 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.099079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/403318b7-a0b4-4a62-8094-9a2ac1127387-webhook-cert\") pod \"metallb-operator-controller-manager-76498b594b-4z2xj\" (UID: \"403318b7-a0b4-4a62-8094-9a2ac1127387\") " pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.099488 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6twl\" (UniqueName: \"kubernetes.io/projected/403318b7-a0b4-4a62-8094-9a2ac1127387-kube-api-access-w6twl\") pod \"metallb-operator-controller-manager-76498b594b-4z2xj\" (UID: \"403318b7-a0b4-4a62-8094-9a2ac1127387\") " pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.099770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/403318b7-a0b4-4a62-8094-9a2ac1127387-apiservice-cert\") pod \"metallb-operator-controller-manager-76498b594b-4z2xj\" (UID: \"403318b7-a0b4-4a62-8094-9a2ac1127387\") " pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.200872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/403318b7-a0b4-4a62-8094-9a2ac1127387-apiservice-cert\") pod \"metallb-operator-controller-manager-76498b594b-4z2xj\" (UID: \"403318b7-a0b4-4a62-8094-9a2ac1127387\") " pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.200992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/403318b7-a0b4-4a62-8094-9a2ac1127387-webhook-cert\") pod \"metallb-operator-controller-manager-76498b594b-4z2xj\" (UID: \"403318b7-a0b4-4a62-8094-9a2ac1127387\") " pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.201021 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6twl\" (UniqueName: \"kubernetes.io/projected/403318b7-a0b4-4a62-8094-9a2ac1127387-kube-api-access-w6twl\") pod \"metallb-operator-controller-manager-76498b594b-4z2xj\" (UID: \"403318b7-a0b4-4a62-8094-9a2ac1127387\") " pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.208251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/403318b7-a0b4-4a62-8094-9a2ac1127387-apiservice-cert\") pod \"metallb-operator-controller-manager-76498b594b-4z2xj\" (UID: \"403318b7-a0b4-4a62-8094-9a2ac1127387\") " pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.209259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/403318b7-a0b4-4a62-8094-9a2ac1127387-webhook-cert\") pod \"metallb-operator-controller-manager-76498b594b-4z2xj\" (UID: \"403318b7-a0b4-4a62-8094-9a2ac1127387\") " pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.224611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6twl\" (UniqueName: \"kubernetes.io/projected/403318b7-a0b4-4a62-8094-9a2ac1127387-kube-api-access-w6twl\") pod \"metallb-operator-controller-manager-76498b594b-4z2xj\" (UID: \"403318b7-a0b4-4a62-8094-9a2ac1127387\") " pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.235090 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.406912 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"]
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.407799 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.411485 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.411878 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.413443 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hh8pc"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.452224 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"]
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.608794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/126a1c1c-acae-407d-854d-fbeb74a88a9c-webhook-cert\") pod \"metallb-operator-webhook-server-866946c6ff-xnqlj\" (UID: \"126a1c1c-acae-407d-854d-fbeb74a88a9c\") " pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.608945 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/126a1c1c-acae-407d-854d-fbeb74a88a9c-apiservice-cert\") pod \"metallb-operator-webhook-server-866946c6ff-xnqlj\" (UID: \"126a1c1c-acae-407d-854d-fbeb74a88a9c\") " pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.608990 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7pt6\" (UniqueName: \"kubernetes.io/projected/126a1c1c-acae-407d-854d-fbeb74a88a9c-kube-api-access-r7pt6\") pod \"metallb-operator-webhook-server-866946c6ff-xnqlj\" (UID: \"126a1c1c-acae-407d-854d-fbeb74a88a9c\") " pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.688747 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"]
Jan 29 03:40:28 crc kubenswrapper[4707]: W0129 03:40:28.704931 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403318b7_a0b4_4a62_8094_9a2ac1127387.slice/crio-902e21915a0c999bb1af64baa5188e297aad08cb458b8a2fd620b8187d2b840f WatchSource:0}: Error finding container 902e21915a0c999bb1af64baa5188e297aad08cb458b8a2fd620b8187d2b840f: Status 404 returned error can't find the container with id 902e21915a0c999bb1af64baa5188e297aad08cb458b8a2fd620b8187d2b840f
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.709936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/126a1c1c-acae-407d-854d-fbeb74a88a9c-apiservice-cert\") pod \"metallb-operator-webhook-server-866946c6ff-xnqlj\" (UID: \"126a1c1c-acae-407d-854d-fbeb74a88a9c\") " pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.709999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7pt6\" (UniqueName: \"kubernetes.io/projected/126a1c1c-acae-407d-854d-fbeb74a88a9c-kube-api-access-r7pt6\") pod \"metallb-operator-webhook-server-866946c6ff-xnqlj\" (UID: \"126a1c1c-acae-407d-854d-fbeb74a88a9c\") " pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.710044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/126a1c1c-acae-407d-854d-fbeb74a88a9c-webhook-cert\") pod \"metallb-operator-webhook-server-866946c6ff-xnqlj\" (UID: \"126a1c1c-acae-407d-854d-fbeb74a88a9c\") " pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.719460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/126a1c1c-acae-407d-854d-fbeb74a88a9c-apiservice-cert\") pod \"metallb-operator-webhook-server-866946c6ff-xnqlj\" (UID: \"126a1c1c-acae-407d-854d-fbeb74a88a9c\") " pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.719498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/126a1c1c-acae-407d-854d-fbeb74a88a9c-webhook-cert\") pod \"metallb-operator-webhook-server-866946c6ff-xnqlj\" (UID: \"126a1c1c-acae-407d-854d-fbeb74a88a9c\") " pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.732169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7pt6\" (UniqueName: \"kubernetes.io/projected/126a1c1c-acae-407d-854d-fbeb74a88a9c-kube-api-access-r7pt6\") pod \"metallb-operator-webhook-server-866946c6ff-xnqlj\" (UID: \"126a1c1c-acae-407d-854d-fbeb74a88a9c\") " pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:28 crc kubenswrapper[4707]: I0129 03:40:28.875099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj" event={"ID":"403318b7-a0b4-4a62-8094-9a2ac1127387","Type":"ContainerStarted","Data":"902e21915a0c999bb1af64baa5188e297aad08cb458b8a2fd620b8187d2b840f"}
Jan 29 03:40:29 crc kubenswrapper[4707]: I0129 03:40:29.031945 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:29 crc kubenswrapper[4707]: I0129 03:40:29.266002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"]
Jan 29 03:40:29 crc kubenswrapper[4707]: W0129 03:40:29.274962 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod126a1c1c_acae_407d_854d_fbeb74a88a9c.slice/crio-ab0b6f83108d3fbb2861facc99359d5caca627d70c7d427e674a1cc0e5dcfdbd WatchSource:0}: Error finding container ab0b6f83108d3fbb2861facc99359d5caca627d70c7d427e674a1cc0e5dcfdbd: Status 404 returned error can't find the container with id ab0b6f83108d3fbb2861facc99359d5caca627d70c7d427e674a1cc0e5dcfdbd
Jan 29 03:40:29 crc kubenswrapper[4707]: I0129 03:40:29.883408 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj" event={"ID":"126a1c1c-acae-407d-854d-fbeb74a88a9c","Type":"ContainerStarted","Data":"ab0b6f83108d3fbb2861facc99359d5caca627d70c7d427e674a1cc0e5dcfdbd"}
Jan 29 03:40:32 crc kubenswrapper[4707]: I0129 03:40:32.912747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj" event={"ID":"403318b7-a0b4-4a62-8094-9a2ac1127387","Type":"ContainerStarted","Data":"5fd9496f4ca75e288d03ca16084e43f046ab157bdb74bd646e12ca2d42c885e6"}
Jan 29 03:40:32 crc kubenswrapper[4707]: I0129 03:40:32.914495 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:40:33 crc kubenswrapper[4707]: I0129 03:40:33.526829 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:33 crc kubenswrapper[4707]: I0129 03:40:33.551212 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj" podStartSLOduration=3.5046505249999997 podStartE2EDuration="6.551182037s" podCreationTimestamp="2026-01-29 03:40:27 +0000 UTC" firstStartedPulling="2026-01-29 03:40:28.708232873 +0000 UTC m=+782.192461778" lastFinishedPulling="2026-01-29 03:40:31.754764385 +0000 UTC m=+785.238993290" observedRunningTime="2026-01-29 03:40:32.937250944 +0000 UTC m=+786.421479849" watchObservedRunningTime="2026-01-29 03:40:33.551182037 +0000 UTC m=+787.035410942"
Jan 29 03:40:33 crc kubenswrapper[4707]: I0129 03:40:33.574656 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:34 crc kubenswrapper[4707]: I0129 03:40:34.927952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj" event={"ID":"126a1c1c-acae-407d-854d-fbeb74a88a9c","Type":"ContainerStarted","Data":"859c5393453c2abdb0404e9dc409646148f84e0f50fc7a805fa95b21442052b2"}
Jan 29 03:40:34 crc kubenswrapper[4707]: I0129 03:40:34.956379 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj" podStartSLOduration=2.346466157 podStartE2EDuration="6.956353657s" podCreationTimestamp="2026-01-29 03:40:28 +0000 UTC" firstStartedPulling="2026-01-29 03:40:29.279879277 +0000 UTC m=+782.764108182" lastFinishedPulling="2026-01-29 03:40:33.889766777 +0000 UTC m=+787.373995682" observedRunningTime="2026-01-29 03:40:34.951479849 +0000 UTC m=+788.435708764" watchObservedRunningTime="2026-01-29 03:40:34.956353657 +0000 UTC m=+788.440582582"
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.291712 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xh56"]
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.292041 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4xh56" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerName="registry-server" containerID="cri-o://b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299" gracePeriod=2
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.660679 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.736357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-catalog-content\") pod \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") "
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.736419 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-utilities\") pod \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") "
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.736464 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn7wc\" (UniqueName: \"kubernetes.io/projected/e5b8c571-1def-4784-8bc0-e52b484a6c1f-kube-api-access-kn7wc\") pod \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\" (UID: \"e5b8c571-1def-4784-8bc0-e52b484a6c1f\") "
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.737911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-utilities" (OuterVolumeSpecName: "utilities") pod "e5b8c571-1def-4784-8bc0-e52b484a6c1f" (UID: "e5b8c571-1def-4784-8bc0-e52b484a6c1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.749852 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b8c571-1def-4784-8bc0-e52b484a6c1f-kube-api-access-kn7wc" (OuterVolumeSpecName: "kube-api-access-kn7wc") pod "e5b8c571-1def-4784-8bc0-e52b484a6c1f" (UID: "e5b8c571-1def-4784-8bc0-e52b484a6c1f"). InnerVolumeSpecName "kube-api-access-kn7wc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.837876 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.837922 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn7wc\" (UniqueName: \"kubernetes.io/projected/e5b8c571-1def-4784-8bc0-e52b484a6c1f-kube-api-access-kn7wc\") on node \"crc\" DevicePath \"\""
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.854333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5b8c571-1def-4784-8bc0-e52b484a6c1f" (UID: "e5b8c571-1def-4784-8bc0-e52b484a6c1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.936589 4707 generic.go:334] "Generic (PLEG): container finished" podID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerID="b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299" exitCode=0
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.936673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xh56" event={"ID":"e5b8c571-1def-4784-8bc0-e52b484a6c1f","Type":"ContainerDied","Data":"b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299"}
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.936718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xh56" event={"ID":"e5b8c571-1def-4784-8bc0-e52b484a6c1f","Type":"ContainerDied","Data":"a6bca5ae06d81a3581b3c1cf2b07c3dc58ef290ccb426b7527b618de179870dc"}
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.936742 4707 scope.go:117] "RemoveContainer" containerID="b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299"
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.936747 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xh56"
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.937454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.939567 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b8c571-1def-4784-8bc0-e52b484a6c1f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.955704 4707 scope.go:117] "RemoveContainer" containerID="30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e"
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.976737 4707 scope.go:117] "RemoveContainer" containerID="e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631"
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.995100 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xh56"]
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.999492 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4xh56"]
Jan 29 03:40:35 crc kubenswrapper[4707]: I0129 03:40:35.999876 4707 scope.go:117] "RemoveContainer" containerID="b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299"
Jan 29 03:40:36 crc kubenswrapper[4707]: E0129 03:40:36.000472 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299\": container with ID starting with b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299 not found: ID does not exist" containerID="b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299"
Jan 29 03:40:36 crc kubenswrapper[4707]: I0129 03:40:36.000586 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299"} err="failed to get container status \"b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299\": rpc error: code = NotFound desc = could not find container \"b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299\": container with ID starting with b2af6800d53d93ec01990ba89b0e6d2c39592a9a7a11768b8faf5e064567b299 not found: ID does not exist"
Jan 29 03:40:36 crc kubenswrapper[4707]: I0129 03:40:36.000694 4707 scope.go:117] "RemoveContainer" containerID="30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e"
Jan 29 03:40:36 crc kubenswrapper[4707]: E0129 03:40:36.001944 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e\": container with ID starting with 30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e not found: ID does not exist" containerID="30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e"
Jan 29 03:40:36 crc kubenswrapper[4707]: I0129 03:40:36.001986 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e"} err="failed to get container status \"30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e\": rpc error: code = NotFound desc = could not find container \"30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e\": container with ID starting with 30a14fd926d10d38ee63b208dd3e828b57e276d02d2155e7286e3006c00fe59e not found: ID does not exist"
Jan 29 03:40:36 crc kubenswrapper[4707]: I0129 03:40:36.002014 4707 scope.go:117] "RemoveContainer" containerID="e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631"
Jan 29 03:40:36 crc kubenswrapper[4707]: E0129 03:40:36.002472 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631\": container with ID starting with e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631 not found: ID does not exist" containerID="e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631"
Jan 29 03:40:36 crc kubenswrapper[4707]: I0129 03:40:36.002494 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631"} err="failed to get container status \"e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631\": rpc error: code = NotFound desc = could not find container \"e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631\": container with ID starting with e86819dc0efd9a4bb1d984557865d81245df1ee69c47dc5df843c5225a470631 not found: ID does not exist"
Jan 29 03:40:37 crc kubenswrapper[4707]: I0129 03:40:37.252630 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" path="/var/lib/kubelet/pods/e5b8c571-1def-4784-8bc0-e52b484a6c1f/volumes"
Jan 29 03:40:49 crc kubenswrapper[4707]: I0129 03:40:49.052570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-866946c6ff-xnqlj"
Jan 29 03:41:04 crc kubenswrapper[4707]: I0129 03:41:04.900612 4707 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.239631 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-76498b594b-4z2xj"
Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.955045 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-b72sz"]
Jan 29
03:41:08 crc kubenswrapper[4707]: E0129 03:41:08.955384 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerName="extract-utilities" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.955406 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerName="extract-utilities" Jan 29 03:41:08 crc kubenswrapper[4707]: E0129 03:41:08.955425 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerName="extract-content" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.955435 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerName="extract-content" Jan 29 03:41:08 crc kubenswrapper[4707]: E0129 03:41:08.955482 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerName="registry-server" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.955491 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerName="registry-server" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.955646 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b8c571-1def-4784-8bc0-e52b484a6c1f" containerName="registry-server" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.958272 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.961696 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.962767 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9fzdd" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.967839 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.971266 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc"] Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.972424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.974064 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 29 03:41:08 crc kubenswrapper[4707]: I0129 03:41:08.980561 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc"] Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.027372 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjcg4\" (UniqueName: \"kubernetes.io/projected/a43d3509-ef8d-47ba-b60f-675e3113086d-kube-api-access-fjcg4\") pod \"frr-k8s-webhook-server-7df86c4f6c-4vtxc\" (UID: \"a43d3509-ef8d-47ba-b60f-675e3113086d\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.027482 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-reloader\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.027526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-frr-conf\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.027558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-frr-sockets\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.027641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a43d3509-ef8d-47ba-b60f-675e3113086d-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-4vtxc\" (UID: \"a43d3509-ef8d-47ba-b60f-675e3113086d\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.027674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a50001d5-1baf-4746-aa06-afa2a7853541-frr-startup\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.027820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a50001d5-1baf-4746-aa06-afa2a7853541-metrics-certs\") pod 
\"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.027868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scjkr\" (UniqueName: \"kubernetes.io/projected/a50001d5-1baf-4746-aa06-afa2a7853541-kube-api-access-scjkr\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.027906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-metrics\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.041133 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2frrd"] Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.042469 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.048503 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.048792 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.049190 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-qq6nr" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.050820 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.068022 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-r9vjq"] Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.069291 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.070949 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.080184 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-r9vjq"] Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/063768b8-90c6-4b82-b3d2-13fbdc42bab5-metrics-certs\") pod \"controller-6968d8fdc4-r9vjq\" (UID: \"063768b8-90c6-4b82-b3d2-13fbdc42bab5\") " pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128381 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjcg4\" (UniqueName: \"kubernetes.io/projected/a43d3509-ef8d-47ba-b60f-675e3113086d-kube-api-access-fjcg4\") pod \"frr-k8s-webhook-server-7df86c4f6c-4vtxc\" (UID: \"a43d3509-ef8d-47ba-b60f-675e3113086d\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/063768b8-90c6-4b82-b3d2-13fbdc42bab5-cert\") pod \"controller-6968d8fdc4-r9vjq\" (UID: \"063768b8-90c6-4b82-b3d2-13fbdc42bab5\") " pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-metrics-certs\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " 
pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-reloader\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-frr-conf\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-frr-sockets\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-memberlist\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a43d3509-ef8d-47ba-b60f-675e3113086d-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-4vtxc\" (UID: \"a43d3509-ef8d-47ba-b60f-675e3113086d\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a50001d5-1baf-4746-aa06-afa2a7853541-frr-startup\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a50001d5-1baf-4746-aa06-afa2a7853541-metrics-certs\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scjkr\" (UniqueName: \"kubernetes.io/projected/a50001d5-1baf-4746-aa06-afa2a7853541-kube-api-access-scjkr\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128688 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-metrics\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffc77\" (UniqueName: \"kubernetes.io/projected/063768b8-90c6-4b82-b3d2-13fbdc42bab5-kube-api-access-ffc77\") pod \"controller-6968d8fdc4-r9vjq\" (UID: \"063768b8-90c6-4b82-b3d2-13fbdc42bab5\") " pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/1d738b61-6875-468d-8fdb-c0d567c8ea88-metallb-excludel2\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.128782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xmrp\" (UniqueName: \"kubernetes.io/projected/1d738b61-6875-468d-8fdb-c0d567c8ea88-kube-api-access-6xmrp\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.129770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-reloader\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: E0129 03:41:09.129880 4707 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 29 03:41:09 crc kubenswrapper[4707]: E0129 03:41:09.129980 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a43d3509-ef8d-47ba-b60f-675e3113086d-cert podName:a43d3509-ef8d-47ba-b60f-675e3113086d nodeName:}" failed. No retries permitted until 2026-01-29 03:41:09.62995188 +0000 UTC m=+823.114180865 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a43d3509-ef8d-47ba-b60f-675e3113086d-cert") pod "frr-k8s-webhook-server-7df86c4f6c-4vtxc" (UID: "a43d3509-ef8d-47ba-b60f-675e3113086d") : secret "frr-k8s-webhook-server-cert" not found Jan 29 03:41:09 crc kubenswrapper[4707]: E0129 03:41:09.129880 4707 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 29 03:41:09 crc kubenswrapper[4707]: E0129 03:41:09.130102 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a50001d5-1baf-4746-aa06-afa2a7853541-metrics-certs podName:a50001d5-1baf-4746-aa06-afa2a7853541 nodeName:}" failed. No retries permitted until 2026-01-29 03:41:09.630074343 +0000 UTC m=+823.114303318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a50001d5-1baf-4746-aa06-afa2a7853541-metrics-certs") pod "frr-k8s-b72sz" (UID: "a50001d5-1baf-4746-aa06-afa2a7853541") : secret "frr-k8s-certs-secret" not found Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.130582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-frr-conf\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.130597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-metrics\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.130684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/a50001d5-1baf-4746-aa06-afa2a7853541-frr-startup\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.130698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a50001d5-1baf-4746-aa06-afa2a7853541-frr-sockets\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.156166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scjkr\" (UniqueName: \"kubernetes.io/projected/a50001d5-1baf-4746-aa06-afa2a7853541-kube-api-access-scjkr\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.163030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjcg4\" (UniqueName: \"kubernetes.io/projected/a43d3509-ef8d-47ba-b60f-675e3113086d-kube-api-access-fjcg4\") pod \"frr-k8s-webhook-server-7df86c4f6c-4vtxc\" (UID: \"a43d3509-ef8d-47ba-b60f-675e3113086d\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.229806 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffc77\" (UniqueName: \"kubernetes.io/projected/063768b8-90c6-4b82-b3d2-13fbdc42bab5-kube-api-access-ffc77\") pod \"controller-6968d8fdc4-r9vjq\" (UID: \"063768b8-90c6-4b82-b3d2-13fbdc42bab5\") " pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.229859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1d738b61-6875-468d-8fdb-c0d567c8ea88-metallb-excludel2\") pod 
\"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.229899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xmrp\" (UniqueName: \"kubernetes.io/projected/1d738b61-6875-468d-8fdb-c0d567c8ea88-kube-api-access-6xmrp\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.229921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/063768b8-90c6-4b82-b3d2-13fbdc42bab5-metrics-certs\") pod \"controller-6968d8fdc4-r9vjq\" (UID: \"063768b8-90c6-4b82-b3d2-13fbdc42bab5\") " pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.229943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-metrics-certs\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.229959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/063768b8-90c6-4b82-b3d2-13fbdc42bab5-cert\") pod \"controller-6968d8fdc4-r9vjq\" (UID: \"063768b8-90c6-4b82-b3d2-13fbdc42bab5\") " pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.229990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-memberlist\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: E0129 
03:41:09.230079 4707 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 29 03:41:09 crc kubenswrapper[4707]: E0129 03:41:09.230132 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-memberlist podName:1d738b61-6875-468d-8fdb-c0d567c8ea88 nodeName:}" failed. No retries permitted until 2026-01-29 03:41:09.730112246 +0000 UTC m=+823.214341151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-memberlist") pod "speaker-2frrd" (UID: "1d738b61-6875-468d-8fdb-c0d567c8ea88") : secret "metallb-memberlist" not found Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.230844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1d738b61-6875-468d-8fdb-c0d567c8ea88-metallb-excludel2\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.234180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/063768b8-90c6-4b82-b3d2-13fbdc42bab5-cert\") pod \"controller-6968d8fdc4-r9vjq\" (UID: \"063768b8-90c6-4b82-b3d2-13fbdc42bab5\") " pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.235024 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/063768b8-90c6-4b82-b3d2-13fbdc42bab5-metrics-certs\") pod \"controller-6968d8fdc4-r9vjq\" (UID: \"063768b8-90c6-4b82-b3d2-13fbdc42bab5\") " pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.235077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-metrics-certs\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.251199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffc77\" (UniqueName: \"kubernetes.io/projected/063768b8-90c6-4b82-b3d2-13fbdc42bab5-kube-api-access-ffc77\") pod \"controller-6968d8fdc4-r9vjq\" (UID: \"063768b8-90c6-4b82-b3d2-13fbdc42bab5\") " pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.251337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xmrp\" (UniqueName: \"kubernetes.io/projected/1d738b61-6875-468d-8fdb-c0d567c8ea88-kube-api-access-6xmrp\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.386433 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.635443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a43d3509-ef8d-47ba-b60f-675e3113086d-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-4vtxc\" (UID: \"a43d3509-ef8d-47ba-b60f-675e3113086d\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.635943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a50001d5-1baf-4746-aa06-afa2a7853541-metrics-certs\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.640613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a43d3509-ef8d-47ba-b60f-675e3113086d-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-4vtxc\" (UID: \"a43d3509-ef8d-47ba-b60f-675e3113086d\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.640656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a50001d5-1baf-4746-aa06-afa2a7853541-metrics-certs\") pod \"frr-k8s-b72sz\" (UID: \"a50001d5-1baf-4746-aa06-afa2a7853541\") " pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.736746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-memberlist\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:09 crc kubenswrapper[4707]: E0129 03:41:09.736966 4707 secret.go:188] 
Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 29 03:41:09 crc kubenswrapper[4707]: E0129 03:41:09.737041 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-memberlist podName:1d738b61-6875-468d-8fdb-c0d567c8ea88 nodeName:}" failed. No retries permitted until 2026-01-29 03:41:10.73702127 +0000 UTC m=+824.221250175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-memberlist") pod "speaker-2frrd" (UID: "1d738b61-6875-468d-8fdb-c0d567c8ea88") : secret "metallb-memberlist" not found Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.856530 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-r9vjq"] Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.884397 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:09 crc kubenswrapper[4707]: I0129 03:41:09.895456 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:10 crc kubenswrapper[4707]: I0129 03:41:10.193080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerStarted","Data":"38edc7a47bec1eb5a9ad51360c378f30efdb108bfaaf12045c49ff56b54e2748"} Jan 29 03:41:10 crc kubenswrapper[4707]: I0129 03:41:10.197480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-r9vjq" event={"ID":"063768b8-90c6-4b82-b3d2-13fbdc42bab5","Type":"ContainerStarted","Data":"bbaf11f9a03b26a14f4ab76ec8f48442e452b384740146b809c1a2442a614e21"} Jan 29 03:41:10 crc kubenswrapper[4707]: I0129 03:41:10.197550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-r9vjq" event={"ID":"063768b8-90c6-4b82-b3d2-13fbdc42bab5","Type":"ContainerStarted","Data":"62daaa3afe5e6e42b6aa98e20f0ad1663b2a058051bc433a0add2f8c34c3be50"} Jan 29 03:41:10 crc kubenswrapper[4707]: I0129 03:41:10.198204 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc"] Jan 29 03:41:10 crc kubenswrapper[4707]: I0129 03:41:10.764468 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-memberlist\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:10 crc kubenswrapper[4707]: I0129 03:41:10.774445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d738b61-6875-468d-8fdb-c0d567c8ea88-memberlist\") pod \"speaker-2frrd\" (UID: \"1d738b61-6875-468d-8fdb-c0d567c8ea88\") " pod="metallb-system/speaker-2frrd" Jan 29 03:41:10 crc kubenswrapper[4707]: I0129 03:41:10.858566 4707 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="metallb-system/speaker-2frrd" Jan 29 03:41:10 crc kubenswrapper[4707]: W0129 03:41:10.880592 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d738b61_6875_468d_8fdb_c0d567c8ea88.slice/crio-0f35e163ddb0c2c694d49efb080be264dca4fb9f2ef2eb91811e1ded7b2efdda WatchSource:0}: Error finding container 0f35e163ddb0c2c694d49efb080be264dca4fb9f2ef2eb91811e1ded7b2efdda: Status 404 returned error can't find the container with id 0f35e163ddb0c2c694d49efb080be264dca4fb9f2ef2eb91811e1ded7b2efdda Jan 29 03:41:11 crc kubenswrapper[4707]: I0129 03:41:11.210696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" event={"ID":"a43d3509-ef8d-47ba-b60f-675e3113086d","Type":"ContainerStarted","Data":"c4f02be23328000ab053b9c1e932c15fc9ed4d5895f33f8b92515bb912ba9536"} Jan 29 03:41:11 crc kubenswrapper[4707]: I0129 03:41:11.216367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-r9vjq" event={"ID":"063768b8-90c6-4b82-b3d2-13fbdc42bab5","Type":"ContainerStarted","Data":"bb36bac7d06195d856762bb53d226f31876ac87d7597a1bc7a88a335ad44e9b0"} Jan 29 03:41:11 crc kubenswrapper[4707]: I0129 03:41:11.216669 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:11 crc kubenswrapper[4707]: I0129 03:41:11.218282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2frrd" event={"ID":"1d738b61-6875-468d-8fdb-c0d567c8ea88","Type":"ContainerStarted","Data":"6f8c1b67cc739463338162e039d1b23e2065c53a3e4f26df96953bc44636d6dd"} Jan 29 03:41:11 crc kubenswrapper[4707]: I0129 03:41:11.218320 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2frrd" 
event={"ID":"1d738b61-6875-468d-8fdb-c0d567c8ea88","Type":"ContainerStarted","Data":"0f35e163ddb0c2c694d49efb080be264dca4fb9f2ef2eb91811e1ded7b2efdda"} Jan 29 03:41:11 crc kubenswrapper[4707]: I0129 03:41:11.245501 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-r9vjq" podStartSLOduration=2.245472084 podStartE2EDuration="2.245472084s" podCreationTimestamp="2026-01-29 03:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:41:11.241729539 +0000 UTC m=+824.725958454" watchObservedRunningTime="2026-01-29 03:41:11.245472084 +0000 UTC m=+824.729700989" Jan 29 03:41:12 crc kubenswrapper[4707]: I0129 03:41:12.229611 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2frrd" event={"ID":"1d738b61-6875-468d-8fdb-c0d567c8ea88","Type":"ContainerStarted","Data":"a366c20ed44c8d41ed44d9549815aea878b98ba8b538451a5d8e5d94abf5c67a"} Jan 29 03:41:12 crc kubenswrapper[4707]: I0129 03:41:12.268780 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2frrd" podStartSLOduration=3.268754237 podStartE2EDuration="3.268754237s" podCreationTimestamp="2026-01-29 03:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:41:12.26282785 +0000 UTC m=+825.747056755" watchObservedRunningTime="2026-01-29 03:41:12.268754237 +0000 UTC m=+825.752983142" Jan 29 03:41:13 crc kubenswrapper[4707]: I0129 03:41:13.235373 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2frrd" Jan 29 03:41:18 crc kubenswrapper[4707]: I0129 03:41:18.276784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" 
event={"ID":"a43d3509-ef8d-47ba-b60f-675e3113086d","Type":"ContainerStarted","Data":"f6a0b41852632e73074e5193fcbf2e3216930decbbd8fb4da91863822a3f1399"} Jan 29 03:41:18 crc kubenswrapper[4707]: I0129 03:41:18.277357 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:18 crc kubenswrapper[4707]: I0129 03:41:18.278665 4707 generic.go:334] "Generic (PLEG): container finished" podID="a50001d5-1baf-4746-aa06-afa2a7853541" containerID="150c02df86d3c0d67fee668acc01737ff80e69f66d131055b36c0b97b5db66cb" exitCode=0 Jan 29 03:41:18 crc kubenswrapper[4707]: I0129 03:41:18.278692 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerDied","Data":"150c02df86d3c0d67fee668acc01737ff80e69f66d131055b36c0b97b5db66cb"} Jan 29 03:41:18 crc kubenswrapper[4707]: I0129 03:41:18.304311 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" podStartSLOduration=3.343543091 podStartE2EDuration="10.304286233s" podCreationTimestamp="2026-01-29 03:41:08 +0000 UTC" firstStartedPulling="2026-01-29 03:41:10.222133589 +0000 UTC m=+823.706362494" lastFinishedPulling="2026-01-29 03:41:17.182876731 +0000 UTC m=+830.667105636" observedRunningTime="2026-01-29 03:41:18.300314742 +0000 UTC m=+831.784543647" watchObservedRunningTime="2026-01-29 03:41:18.304286233 +0000 UTC m=+831.788515138" Jan 29 03:41:19 crc kubenswrapper[4707]: I0129 03:41:19.300745 4707 generic.go:334] "Generic (PLEG): container finished" podID="a50001d5-1baf-4746-aa06-afa2a7853541" containerID="35b37dd284b69726e594d8960d870c8016127ce627be6064cea1f53190bd319c" exitCode=0 Jan 29 03:41:19 crc kubenswrapper[4707]: I0129 03:41:19.301745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" 
event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerDied","Data":"35b37dd284b69726e594d8960d870c8016127ce627be6064cea1f53190bd319c"} Jan 29 03:41:20 crc kubenswrapper[4707]: I0129 03:41:20.310665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerDied","Data":"aeb672c52cb4210ebca077181fc92f369f665f941d0bde8266e8a4541499b047"} Jan 29 03:41:20 crc kubenswrapper[4707]: I0129 03:41:20.310658 4707 generic.go:334] "Generic (PLEG): container finished" podID="a50001d5-1baf-4746-aa06-afa2a7853541" containerID="aeb672c52cb4210ebca077181fc92f369f665f941d0bde8266e8a4541499b047" exitCode=0 Jan 29 03:41:21 crc kubenswrapper[4707]: I0129 03:41:21.328275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerStarted","Data":"03fb999e0f546886534ee4d2ef5f0232f526a2810c50c7990226adf0f235e63e"} Jan 29 03:41:21 crc kubenswrapper[4707]: I0129 03:41:21.328596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerStarted","Data":"48a62e941e081345af80e3739e2739c00c80f3e62975de2dc73723782771ec5d"} Jan 29 03:41:21 crc kubenswrapper[4707]: I0129 03:41:21.328607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerStarted","Data":"070b427d2ab43e8351b1717d631fac9dbcf5b1f1b2f4f09516002644d8216e5b"} Jan 29 03:41:21 crc kubenswrapper[4707]: I0129 03:41:21.328616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerStarted","Data":"eeec881cd87c6510f68726b059e9de14d50a05986ca81ac3f03145384c61a8bf"} Jan 29 03:41:22 crc kubenswrapper[4707]: I0129 03:41:22.346437 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerStarted","Data":"14d55f52c7502b16a8245859561b7ef80b9ae8d74b11b4b2fd268f1ef06f05e8"} Jan 29 03:41:22 crc kubenswrapper[4707]: I0129 03:41:22.346495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b72sz" event={"ID":"a50001d5-1baf-4746-aa06-afa2a7853541","Type":"ContainerStarted","Data":"84e92336ed17fd950dae1c78bf782b1b3a254d6fed78d0fbdf0125bd9c7861a1"} Jan 29 03:41:22 crc kubenswrapper[4707]: I0129 03:41:22.373757 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-b72sz" podStartSLOduration=7.184044099 podStartE2EDuration="14.373734618s" podCreationTimestamp="2026-01-29 03:41:08 +0000 UTC" firstStartedPulling="2026-01-29 03:41:10.001152576 +0000 UTC m=+823.485381491" lastFinishedPulling="2026-01-29 03:41:17.190843105 +0000 UTC m=+830.675072010" observedRunningTime="2026-01-29 03:41:22.368282785 +0000 UTC m=+835.852511710" watchObservedRunningTime="2026-01-29 03:41:22.373734618 +0000 UTC m=+835.857963523" Jan 29 03:41:23 crc kubenswrapper[4707]: I0129 03:41:23.354310 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:24 crc kubenswrapper[4707]: I0129 03:41:24.885661 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:24 crc kubenswrapper[4707]: I0129 03:41:24.960183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:29 crc kubenswrapper[4707]: I0129 03:41:29.393252 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-r9vjq" Jan 29 03:41:29 crc kubenswrapper[4707]: I0129 03:41:29.905932 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4vtxc" Jan 29 03:41:30 crc kubenswrapper[4707]: I0129 03:41:30.862172 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2frrd" Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.625805 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rsxrp"] Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.637374 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rsxrp" Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.640929 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.641391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-mt2zr" Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.643466 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.651215 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rsxrp"] Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.736914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6m2l\" (UniqueName: \"kubernetes.io/projected/025389d3-5111-4dad-9e1a-6562f7aaf627-kube-api-access-g6m2l\") pod \"openstack-operator-index-rsxrp\" (UID: \"025389d3-5111-4dad-9e1a-6562f7aaf627\") " pod="openstack-operators/openstack-operator-index-rsxrp" Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.839190 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6m2l\" (UniqueName: 
\"kubernetes.io/projected/025389d3-5111-4dad-9e1a-6562f7aaf627-kube-api-access-g6m2l\") pod \"openstack-operator-index-rsxrp\" (UID: \"025389d3-5111-4dad-9e1a-6562f7aaf627\") " pod="openstack-operators/openstack-operator-index-rsxrp" Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.862086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6m2l\" (UniqueName: \"kubernetes.io/projected/025389d3-5111-4dad-9e1a-6562f7aaf627-kube-api-access-g6m2l\") pod \"openstack-operator-index-rsxrp\" (UID: \"025389d3-5111-4dad-9e1a-6562f7aaf627\") " pod="openstack-operators/openstack-operator-index-rsxrp" Jan 29 03:41:33 crc kubenswrapper[4707]: I0129 03:41:33.963815 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rsxrp" Jan 29 03:41:34 crc kubenswrapper[4707]: I0129 03:41:34.387992 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rsxrp"] Jan 29 03:41:34 crc kubenswrapper[4707]: I0129 03:41:34.430229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rsxrp" event={"ID":"025389d3-5111-4dad-9e1a-6562f7aaf627","Type":"ContainerStarted","Data":"4aec39ac11963279ae18ec8813009195664218924fdd830ee886f465bc3b7f44"} Jan 29 03:41:36 crc kubenswrapper[4707]: I0129 03:41:36.781990 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rsxrp"] Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.391452 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-pnmmg"] Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.396221 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pnmmg" Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.402661 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pnmmg"] Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.453852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rsxrp" event={"ID":"025389d3-5111-4dad-9e1a-6562f7aaf627","Type":"ContainerStarted","Data":"38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd"} Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.454045 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rsxrp" podUID="025389d3-5111-4dad-9e1a-6562f7aaf627" containerName="registry-server" containerID="cri-o://38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd" gracePeriod=2 Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.478477 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rsxrp" podStartSLOduration=2.167766756 podStartE2EDuration="4.478449239s" podCreationTimestamp="2026-01-29 03:41:33 +0000 UTC" firstStartedPulling="2026-01-29 03:41:34.391529133 +0000 UTC m=+847.875758038" lastFinishedPulling="2026-01-29 03:41:36.702211616 +0000 UTC m=+850.186440521" observedRunningTime="2026-01-29 03:41:37.474675174 +0000 UTC m=+850.958904099" watchObservedRunningTime="2026-01-29 03:41:37.478449239 +0000 UTC m=+850.962678144" Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.494824 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mbxq\" (UniqueName: \"kubernetes.io/projected/2553fa13-b0b4-45c7-9317-f6be21e7c1f0-kube-api-access-7mbxq\") pod \"openstack-operator-index-pnmmg\" (UID: \"2553fa13-b0b4-45c7-9317-f6be21e7c1f0\") " 
pod="openstack-operators/openstack-operator-index-pnmmg" Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.597424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mbxq\" (UniqueName: \"kubernetes.io/projected/2553fa13-b0b4-45c7-9317-f6be21e7c1f0-kube-api-access-7mbxq\") pod \"openstack-operator-index-pnmmg\" (UID: \"2553fa13-b0b4-45c7-9317-f6be21e7c1f0\") " pod="openstack-operators/openstack-operator-index-pnmmg" Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.621036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mbxq\" (UniqueName: \"kubernetes.io/projected/2553fa13-b0b4-45c7-9317-f6be21e7c1f0-kube-api-access-7mbxq\") pod \"openstack-operator-index-pnmmg\" (UID: \"2553fa13-b0b4-45c7-9317-f6be21e7c1f0\") " pod="openstack-operators/openstack-operator-index-pnmmg" Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.720905 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pnmmg" Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.873863 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rsxrp" Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.902373 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6m2l\" (UniqueName: \"kubernetes.io/projected/025389d3-5111-4dad-9e1a-6562f7aaf627-kube-api-access-g6m2l\") pod \"025389d3-5111-4dad-9e1a-6562f7aaf627\" (UID: \"025389d3-5111-4dad-9e1a-6562f7aaf627\") " Jan 29 03:41:37 crc kubenswrapper[4707]: I0129 03:41:37.907696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025389d3-5111-4dad-9e1a-6562f7aaf627-kube-api-access-g6m2l" (OuterVolumeSpecName: "kube-api-access-g6m2l") pod "025389d3-5111-4dad-9e1a-6562f7aaf627" (UID: "025389d3-5111-4dad-9e1a-6562f7aaf627"). InnerVolumeSpecName "kube-api-access-g6m2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.004940 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6m2l\" (UniqueName: \"kubernetes.io/projected/025389d3-5111-4dad-9e1a-6562f7aaf627-kube-api-access-g6m2l\") on node \"crc\" DevicePath \"\"" Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.214304 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pnmmg"] Jan 29 03:41:38 crc kubenswrapper[4707]: W0129 03:41:38.228125 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2553fa13_b0b4_45c7_9317_f6be21e7c1f0.slice/crio-1ba8c785e55c3a76fc12edab0eab5293ecaf33c7e1e0cbb8c5dcb28d81775753 WatchSource:0}: Error finding container 1ba8c785e55c3a76fc12edab0eab5293ecaf33c7e1e0cbb8c5dcb28d81775753: Status 404 returned error can't find the container with id 1ba8c785e55c3a76fc12edab0eab5293ecaf33c7e1e0cbb8c5dcb28d81775753 Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.462802 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pnmmg" event={"ID":"2553fa13-b0b4-45c7-9317-f6be21e7c1f0","Type":"ContainerStarted","Data":"d2f6ec2c2e8a7dc664baff73c2208969711fad3ff9a91dfecfc70443a82a87d6"} Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.462872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pnmmg" event={"ID":"2553fa13-b0b4-45c7-9317-f6be21e7c1f0","Type":"ContainerStarted","Data":"1ba8c785e55c3a76fc12edab0eab5293ecaf33c7e1e0cbb8c5dcb28d81775753"} Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.465986 4707 generic.go:334] "Generic (PLEG): container finished" podID="025389d3-5111-4dad-9e1a-6562f7aaf627" containerID="38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd" exitCode=0 Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.466031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rsxrp" event={"ID":"025389d3-5111-4dad-9e1a-6562f7aaf627","Type":"ContainerDied","Data":"38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd"} Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.466053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rsxrp" event={"ID":"025389d3-5111-4dad-9e1a-6562f7aaf627","Type":"ContainerDied","Data":"4aec39ac11963279ae18ec8813009195664218924fdd830ee886f465bc3b7f44"} Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.466079 4707 scope.go:117] "RemoveContainer" containerID="38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd" Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.466207 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rsxrp" Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.486482 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-pnmmg" podStartSLOduration=1.418403054 podStartE2EDuration="1.486451745s" podCreationTimestamp="2026-01-29 03:41:37 +0000 UTC" firstStartedPulling="2026-01-29 03:41:38.235156376 +0000 UTC m=+851.719385331" lastFinishedPulling="2026-01-29 03:41:38.303205097 +0000 UTC m=+851.787434022" observedRunningTime="2026-01-29 03:41:38.4812533 +0000 UTC m=+851.965482215" watchObservedRunningTime="2026-01-29 03:41:38.486451745 +0000 UTC m=+851.970680650" Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.494294 4707 scope.go:117] "RemoveContainer" containerID="38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd" Jan 29 03:41:38 crc kubenswrapper[4707]: E0129 03:41:38.495008 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd\": container with ID starting with 38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd not found: ID does not exist" containerID="38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd" Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.495038 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd"} err="failed to get container status \"38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd\": rpc error: code = NotFound desc = could not find container \"38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd\": container with ID starting with 38da38cc3764437fa16c59bb95a082147d41a9e8fcd16731b8a72ffc679470dd not found: ID does not exist" Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 
03:41:38.505868 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rsxrp"] Jan 29 03:41:38 crc kubenswrapper[4707]: I0129 03:41:38.509912 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rsxrp"] Jan 29 03:41:39 crc kubenswrapper[4707]: I0129 03:41:39.259279 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025389d3-5111-4dad-9e1a-6562f7aaf627" path="/var/lib/kubelet/pods/025389d3-5111-4dad-9e1a-6562f7aaf627/volumes" Jan 29 03:41:39 crc kubenswrapper[4707]: I0129 03:41:39.888868 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-b72sz" Jan 29 03:41:47 crc kubenswrapper[4707]: I0129 03:41:47.721072 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-pnmmg" Jan 29 03:41:47 crc kubenswrapper[4707]: I0129 03:41:47.721859 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-pnmmg" Jan 29 03:41:47 crc kubenswrapper[4707]: I0129 03:41:47.746112 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-pnmmg" Jan 29 03:41:48 crc kubenswrapper[4707]: I0129 03:41:48.564746 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-pnmmg" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.410050 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v"] Jan 29 03:41:55 crc kubenswrapper[4707]: E0129 03:41:55.411640 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025389d3-5111-4dad-9e1a-6562f7aaf627" containerName="registry-server" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.411668 4707 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="025389d3-5111-4dad-9e1a-6562f7aaf627" containerName="registry-server" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.411957 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="025389d3-5111-4dad-9e1a-6562f7aaf627" containerName="registry-server" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.413567 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.415591 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-rbg7h" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.415794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v"] Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.584026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-util\") pod \"8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") " pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.584087 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-bundle\") pod \"8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") " pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.584128 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbmkt\" (UniqueName: \"kubernetes.io/projected/882475a0-6529-4596-9104-2c7ec1c2e414-kube-api-access-vbmkt\") pod \"8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") " pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.686498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-util\") pod \"8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") " pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.686570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-bundle\") pod \"8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") " pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.686596 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbmkt\" (UniqueName: \"kubernetes.io/projected/882475a0-6529-4596-9104-2c7ec1c2e414-kube-api-access-vbmkt\") pod \"8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") " pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.687295 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-bundle\") pod \"8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") " pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.687291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-util\") pod \"8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") " pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.713113 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbmkt\" (UniqueName: \"kubernetes.io/projected/882475a0-6529-4596-9104-2c7ec1c2e414-kube-api-access-vbmkt\") pod \"8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") " pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" Jan 29 03:41:55 crc kubenswrapper[4707]: I0129 03:41:55.741310 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v"
Jan 29 03:41:56 crc kubenswrapper[4707]: I0129 03:41:56.005625 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v"]
Jan 29 03:41:56 crc kubenswrapper[4707]: W0129 03:41:56.026923 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod882475a0_6529_4596_9104_2c7ec1c2e414.slice/crio-fe1e6c091e8f28f91e6963bb14add035573864b1211e40381260a065487287d5 WatchSource:0}: Error finding container fe1e6c091e8f28f91e6963bb14add035573864b1211e40381260a065487287d5: Status 404 returned error can't find the container with id fe1e6c091e8f28f91e6963bb14add035573864b1211e40381260a065487287d5
Jan 29 03:41:56 crc kubenswrapper[4707]: I0129 03:41:56.602383 4707 generic.go:334] "Generic (PLEG): container finished" podID="882475a0-6529-4596-9104-2c7ec1c2e414" containerID="e2ba1ad45f455dd7467167d42c9b23bd01ff3d0f7631eef9bc54dcef979f7d6c" exitCode=0
Jan 29 03:41:56 crc kubenswrapper[4707]: I0129 03:41:56.602489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" event={"ID":"882475a0-6529-4596-9104-2c7ec1c2e414","Type":"ContainerDied","Data":"e2ba1ad45f455dd7467167d42c9b23bd01ff3d0f7631eef9bc54dcef979f7d6c"}
Jan 29 03:41:56 crc kubenswrapper[4707]: I0129 03:41:56.602953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" event={"ID":"882475a0-6529-4596-9104-2c7ec1c2e414","Type":"ContainerStarted","Data":"fe1e6c091e8f28f91e6963bb14add035573864b1211e40381260a065487287d5"}
Jan 29 03:41:57 crc kubenswrapper[4707]: I0129 03:41:57.614670 4707 generic.go:334] "Generic (PLEG): container finished" podID="882475a0-6529-4596-9104-2c7ec1c2e414" containerID="ac4bdf71c665810219ed98c8574f0860cba0fb0da9cf07a36fcd00a917a02449" exitCode=0
Jan 29 03:41:57 crc kubenswrapper[4707]: I0129 03:41:57.614793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" event={"ID":"882475a0-6529-4596-9104-2c7ec1c2e414","Type":"ContainerDied","Data":"ac4bdf71c665810219ed98c8574f0860cba0fb0da9cf07a36fcd00a917a02449"}
Jan 29 03:41:58 crc kubenswrapper[4707]: I0129 03:41:58.636245 4707 generic.go:334] "Generic (PLEG): container finished" podID="882475a0-6529-4596-9104-2c7ec1c2e414" containerID="9b180db4a10852c45a01825c02745a3f069015a923726b3789d4904c50c37058" exitCode=0
Jan 29 03:41:58 crc kubenswrapper[4707]: I0129 03:41:58.636327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" event={"ID":"882475a0-6529-4596-9104-2c7ec1c2e414","Type":"ContainerDied","Data":"9b180db4a10852c45a01825c02745a3f069015a923726b3789d4904c50c37058"}
Jan 29 03:41:59 crc kubenswrapper[4707]: I0129 03:41:59.936427 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v"
Jan 29 03:41:59 crc kubenswrapper[4707]: I0129 03:41:59.957406 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-bundle\") pod \"882475a0-6529-4596-9104-2c7ec1c2e414\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") "
Jan 29 03:41:59 crc kubenswrapper[4707]: I0129 03:41:59.957697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-util\") pod \"882475a0-6529-4596-9104-2c7ec1c2e414\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") "
Jan 29 03:41:59 crc kubenswrapper[4707]: I0129 03:41:59.957753 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbmkt\" (UniqueName: \"kubernetes.io/projected/882475a0-6529-4596-9104-2c7ec1c2e414-kube-api-access-vbmkt\") pod \"882475a0-6529-4596-9104-2c7ec1c2e414\" (UID: \"882475a0-6529-4596-9104-2c7ec1c2e414\") "
Jan 29 03:41:59 crc kubenswrapper[4707]: I0129 03:41:59.958656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-bundle" (OuterVolumeSpecName: "bundle") pod "882475a0-6529-4596-9104-2c7ec1c2e414" (UID: "882475a0-6529-4596-9104-2c7ec1c2e414"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:41:59 crc kubenswrapper[4707]: I0129 03:41:59.972904 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-util" (OuterVolumeSpecName: "util") pod "882475a0-6529-4596-9104-2c7ec1c2e414" (UID: "882475a0-6529-4596-9104-2c7ec1c2e414"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:42:00 crc kubenswrapper[4707]: I0129 03:42:00.019400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882475a0-6529-4596-9104-2c7ec1c2e414-kube-api-access-vbmkt" (OuterVolumeSpecName: "kube-api-access-vbmkt") pod "882475a0-6529-4596-9104-2c7ec1c2e414" (UID: "882475a0-6529-4596-9104-2c7ec1c2e414"). InnerVolumeSpecName "kube-api-access-vbmkt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:42:00 crc kubenswrapper[4707]: I0129 03:42:00.059115 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-util\") on node \"crc\" DevicePath \"\""
Jan 29 03:42:00 crc kubenswrapper[4707]: I0129 03:42:00.059155 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbmkt\" (UniqueName: \"kubernetes.io/projected/882475a0-6529-4596-9104-2c7ec1c2e414-kube-api-access-vbmkt\") on node \"crc\" DevicePath \"\""
Jan 29 03:42:00 crc kubenswrapper[4707]: I0129 03:42:00.059167 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/882475a0-6529-4596-9104-2c7ec1c2e414-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:42:00 crc kubenswrapper[4707]: I0129 03:42:00.654995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v" event={"ID":"882475a0-6529-4596-9104-2c7ec1c2e414","Type":"ContainerDied","Data":"fe1e6c091e8f28f91e6963bb14add035573864b1211e40381260a065487287d5"}
Jan 29 03:42:00 crc kubenswrapper[4707]: I0129 03:42:00.655090 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe1e6c091e8f28f91e6963bb14add035573864b1211e40381260a065487287d5"
Jan 29 03:42:00 crc kubenswrapper[4707]: I0129 03:42:00.655090 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v"
Jan 29 03:42:03 crc kubenswrapper[4707]: I0129 03:42:03.463429 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 03:42:03 crc kubenswrapper[4707]: I0129 03:42:03.463925 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.455303 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"]
Jan 29 03:42:07 crc kubenswrapper[4707]: E0129 03:42:07.456119 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882475a0-6529-4596-9104-2c7ec1c2e414" containerName="pull"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.456140 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="882475a0-6529-4596-9104-2c7ec1c2e414" containerName="pull"
Jan 29 03:42:07 crc kubenswrapper[4707]: E0129 03:42:07.456171 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882475a0-6529-4596-9104-2c7ec1c2e414" containerName="extract"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.456180 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="882475a0-6529-4596-9104-2c7ec1c2e414" containerName="extract"
Jan 29 03:42:07 crc kubenswrapper[4707]: E0129 03:42:07.456194 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882475a0-6529-4596-9104-2c7ec1c2e414" containerName="util"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.456202 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="882475a0-6529-4596-9104-2c7ec1c2e414" containerName="util"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.456375 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="882475a0-6529-4596-9104-2c7ec1c2e414" containerName="extract"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.457064 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.461779 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-bhsls"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.477802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cf9j\" (UniqueName: \"kubernetes.io/projected/76e29c5c-b257-48b7-953c-d7db3c6407ed-kube-api-access-2cf9j\") pod \"openstack-operator-controller-init-6955d4df64-rgqtz\" (UID: \"76e29c5c-b257-48b7-953c-d7db3c6407ed\") " pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.479614 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"]
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.579279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cf9j\" (UniqueName: \"kubernetes.io/projected/76e29c5c-b257-48b7-953c-d7db3c6407ed-kube-api-access-2cf9j\") pod \"openstack-operator-controller-init-6955d4df64-rgqtz\" (UID: \"76e29c5c-b257-48b7-953c-d7db3c6407ed\") " pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.599215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cf9j\" (UniqueName: \"kubernetes.io/projected/76e29c5c-b257-48b7-953c-d7db3c6407ed-kube-api-access-2cf9j\") pod \"openstack-operator-controller-init-6955d4df64-rgqtz\" (UID: \"76e29c5c-b257-48b7-953c-d7db3c6407ed\") " pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"
Jan 29 03:42:07 crc kubenswrapper[4707]: I0129 03:42:07.788257 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"
Jan 29 03:42:08 crc kubenswrapper[4707]: I0129 03:42:08.004919 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"]
Jan 29 03:42:08 crc kubenswrapper[4707]: I0129 03:42:08.714238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz" event={"ID":"76e29c5c-b257-48b7-953c-d7db3c6407ed","Type":"ContainerStarted","Data":"ebbb87a9c32b84cd33f30c491085ffcec7f9b61c39adbdf25ce62bc06ca7de44"}
Jan 29 03:42:12 crc kubenswrapper[4707]: I0129 03:42:12.746057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz" event={"ID":"76e29c5c-b257-48b7-953c-d7db3c6407ed","Type":"ContainerStarted","Data":"bcbb82a00d69463b2287a0656b3814e2736105417c8c0bf35a21fcb1f447d408"}
Jan 29 03:42:12 crc kubenswrapper[4707]: I0129 03:42:12.747929 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"
Jan 29 03:42:12 crc kubenswrapper[4707]: I0129 03:42:12.810986 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz" podStartSLOduration=1.971007546 podStartE2EDuration="5.810963987s" podCreationTimestamp="2026-01-29 03:42:07 +0000 UTC" firstStartedPulling="2026-01-29 03:42:08.017866132 +0000 UTC m=+881.502095037" lastFinishedPulling="2026-01-29 03:42:11.857822573 +0000 UTC m=+885.342051478" observedRunningTime="2026-01-29 03:42:12.808891449 +0000 UTC m=+886.293120354" watchObservedRunningTime="2026-01-29 03:42:12.810963987 +0000 UTC m=+886.295192892"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.713785 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8g5b7"]
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.715862 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.729801 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8g5b7"]
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.826843 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-utilities\") pod \"redhat-marketplace-8g5b7\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") " pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.826915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-catalog-content\") pod \"redhat-marketplace-8g5b7\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") " pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.826969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfzht\" (UniqueName: \"kubernetes.io/projected/c9ecee46-1449-46e9-a0d5-de5e171d8b69-kube-api-access-vfzht\") pod \"redhat-marketplace-8g5b7\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") " pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.927930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-utilities\") pod \"redhat-marketplace-8g5b7\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") " pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.928025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-catalog-content\") pod \"redhat-marketplace-8g5b7\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") " pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.928085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfzht\" (UniqueName: \"kubernetes.io/projected/c9ecee46-1449-46e9-a0d5-de5e171d8b69-kube-api-access-vfzht\") pod \"redhat-marketplace-8g5b7\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") " pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.928742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-utilities\") pod \"redhat-marketplace-8g5b7\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") " pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.929032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-catalog-content\") pod \"redhat-marketplace-8g5b7\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") " pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:16 crc kubenswrapper[4707]: I0129 03:42:16.958341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfzht\" (UniqueName: \"kubernetes.io/projected/c9ecee46-1449-46e9-a0d5-de5e171d8b69-kube-api-access-vfzht\") pod \"redhat-marketplace-8g5b7\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") " pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:17 crc kubenswrapper[4707]: I0129 03:42:17.033646 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:17 crc kubenswrapper[4707]: I0129 03:42:17.300729 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8g5b7"]
Jan 29 03:42:17 crc kubenswrapper[4707]: I0129 03:42:17.782872 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerID="aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60" exitCode=0
Jan 29 03:42:17 crc kubenswrapper[4707]: I0129 03:42:17.782923 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5b7" event={"ID":"c9ecee46-1449-46e9-a0d5-de5e171d8b69","Type":"ContainerDied","Data":"aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60"}
Jan 29 03:42:17 crc kubenswrapper[4707]: I0129 03:42:17.782956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5b7" event={"ID":"c9ecee46-1449-46e9-a0d5-de5e171d8b69","Type":"ContainerStarted","Data":"df84215e56fc2d9534fdbbd5248499d737215d37b0dacdb5eae3f84891e873aa"}
Jan 29 03:42:17 crc kubenswrapper[4707]: I0129 03:42:17.792531 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6955d4df64-rgqtz"
Jan 29 03:42:20 crc kubenswrapper[4707]: I0129 03:42:20.806364 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerID="268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9" exitCode=0
Jan 29 03:42:20 crc kubenswrapper[4707]: I0129 03:42:20.806467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5b7" event={"ID":"c9ecee46-1449-46e9-a0d5-de5e171d8b69","Type":"ContainerDied","Data":"268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9"}
Jan 29 03:42:21 crc kubenswrapper[4707]: I0129 03:42:21.818099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5b7" event={"ID":"c9ecee46-1449-46e9-a0d5-de5e171d8b69","Type":"ContainerStarted","Data":"37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7"}
Jan 29 03:42:21 crc kubenswrapper[4707]: I0129 03:42:21.840119 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8g5b7" podStartSLOduration=2.3982810949999998 podStartE2EDuration="5.840091054s" podCreationTimestamp="2026-01-29 03:42:16 +0000 UTC" firstStartedPulling="2026-01-29 03:42:17.786893038 +0000 UTC m=+891.271121943" lastFinishedPulling="2026-01-29 03:42:21.228702997 +0000 UTC m=+894.712931902" observedRunningTime="2026-01-29 03:42:21.836217166 +0000 UTC m=+895.320446071" watchObservedRunningTime="2026-01-29 03:42:21.840091054 +0000 UTC m=+895.324319969"
Jan 29 03:42:27 crc kubenswrapper[4707]: I0129 03:42:27.033985 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:27 crc kubenswrapper[4707]: I0129 03:42:27.034454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:27 crc kubenswrapper[4707]: I0129 03:42:27.076467 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:27 crc kubenswrapper[4707]: I0129 03:42:27.914822 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:27 crc kubenswrapper[4707]: I0129 03:42:27.974205 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8g5b7"]
Jan 29 03:42:29 crc kubenswrapper[4707]: I0129 03:42:29.880841 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8g5b7" podUID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerName="registry-server" containerID="cri-o://37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7" gracePeriod=2
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.295066 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.437656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-catalog-content\") pod \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") "
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.437853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-utilities\") pod \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") "
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.437988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfzht\" (UniqueName: \"kubernetes.io/projected/c9ecee46-1449-46e9-a0d5-de5e171d8b69-kube-api-access-vfzht\") pod \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\" (UID: \"c9ecee46-1449-46e9-a0d5-de5e171d8b69\") "
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.438987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-utilities" (OuterVolumeSpecName: "utilities") pod "c9ecee46-1449-46e9-a0d5-de5e171d8b69" (UID: "c9ecee46-1449-46e9-a0d5-de5e171d8b69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.444332 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ecee46-1449-46e9-a0d5-de5e171d8b69-kube-api-access-vfzht" (OuterVolumeSpecName: "kube-api-access-vfzht") pod "c9ecee46-1449-46e9-a0d5-de5e171d8b69" (UID: "c9ecee46-1449-46e9-a0d5-de5e171d8b69"). InnerVolumeSpecName "kube-api-access-vfzht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.460496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9ecee46-1449-46e9-a0d5-de5e171d8b69" (UID: "c9ecee46-1449-46e9-a0d5-de5e171d8b69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.539260 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfzht\" (UniqueName: \"kubernetes.io/projected/c9ecee46-1449-46e9-a0d5-de5e171d8b69-kube-api-access-vfzht\") on node \"crc\" DevicePath \"\""
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.539311 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.539320 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ecee46-1449-46e9-a0d5-de5e171d8b69-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.890113 4707 generic.go:334] "Generic (PLEG): container finished" podID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerID="37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7" exitCode=0
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.890236 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8g5b7"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.890211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5b7" event={"ID":"c9ecee46-1449-46e9-a0d5-de5e171d8b69","Type":"ContainerDied","Data":"37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7"}
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.890352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5b7" event={"ID":"c9ecee46-1449-46e9-a0d5-de5e171d8b69","Type":"ContainerDied","Data":"df84215e56fc2d9534fdbbd5248499d737215d37b0dacdb5eae3f84891e873aa"}
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.890404 4707 scope.go:117] "RemoveContainer" containerID="37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.908474 4707 scope.go:117] "RemoveContainer" containerID="268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.927212 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8g5b7"]
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.927326 4707 scope.go:117] "RemoveContainer" containerID="aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.932446 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8g5b7"]
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.952457 4707 scope.go:117] "RemoveContainer" containerID="37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7"
Jan 29 03:42:30 crc kubenswrapper[4707]: E0129 03:42:30.952956 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7\": container with ID starting with 37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7 not found: ID does not exist" containerID="37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.952999 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7"} err="failed to get container status \"37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7\": rpc error: code = NotFound desc = could not find container \"37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7\": container with ID starting with 37f47a89fca3da1976b921c828ead6ab7a343b462af88e699a725a958da0cbc7 not found: ID does not exist"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.953031 4707 scope.go:117] "RemoveContainer" containerID="268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9"
Jan 29 03:42:30 crc kubenswrapper[4707]: E0129 03:42:30.953552 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9\": container with ID starting with 268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9 not found: ID does not exist" containerID="268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.953585 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9"} err="failed to get container status \"268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9\": rpc error: code = NotFound desc = could not find container \"268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9\": container with ID starting with 268822847d3b84176947e18ffdb0f205a15841301f4b36c7b7ee0f4c196c51a9 not found: ID does not exist"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.953602 4707 scope.go:117] "RemoveContainer" containerID="aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60"
Jan 29 03:42:30 crc kubenswrapper[4707]: E0129 03:42:30.953981 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60\": container with ID starting with aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60 not found: ID does not exist" containerID="aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60"
Jan 29 03:42:30 crc kubenswrapper[4707]: I0129 03:42:30.954013 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60"} err="failed to get container status \"aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60\": rpc error: code = NotFound desc = could not find container \"aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60\": container with ID starting with aa7f3fb7665bea2a2e5325deb8ca8b16419816d0c05c62187126d4a412ed9d60 not found: ID does not exist"
Jan 29 03:42:31 crc kubenswrapper[4707]: I0129 03:42:31.251296 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" path="/var/lib/kubelet/pods/c9ecee46-1449-46e9-a0d5-de5e171d8b69/volumes"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.724505 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cqb8x"]
Jan 29 03:42:32 crc kubenswrapper[4707]: E0129 03:42:32.724947 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerName="extract-utilities"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.724971 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerName="extract-utilities"
Jan 29 03:42:32 crc kubenswrapper[4707]: E0129 03:42:32.725009 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerName="registry-server"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.725022 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerName="registry-server"
Jan 29 03:42:32 crc kubenswrapper[4707]: E0129 03:42:32.725041 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerName="extract-content"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.725052 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerName="extract-content"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.725220 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ecee46-1449-46e9-a0d5-de5e171d8b69" containerName="registry-server"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.726818 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.747940 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cqb8x"]
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.871358 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-catalog-content\") pod \"community-operators-cqb8x\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.871534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-utilities\") pod \"community-operators-cqb8x\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.871780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwpwq\" (UniqueName: \"kubernetes.io/projected/3129e577-ba51-4ba9-87e6-7f3bc57cef20-kube-api-access-rwpwq\") pod \"community-operators-cqb8x\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.973279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-catalog-content\") pod \"community-operators-cqb8x\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.973391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-utilities\") pod \"community-operators-cqb8x\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.973443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwpwq\" (UniqueName: \"kubernetes.io/projected/3129e577-ba51-4ba9-87e6-7f3bc57cef20-kube-api-access-rwpwq\") pod \"community-operators-cqb8x\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.974278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-utilities\") pod \"community-operators-cqb8x\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.974375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-catalog-content\") pod \"community-operators-cqb8x\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:32 crc kubenswrapper[4707]: I0129 03:42:32.994713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwpwq\" (UniqueName: \"kubernetes.io/projected/3129e577-ba51-4ba9-87e6-7f3bc57cef20-kube-api-access-rwpwq\") pod \"community-operators-cqb8x\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:33 crc kubenswrapper[4707]: I0129 03:42:33.045758 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:33 crc kubenswrapper[4707]: I0129 03:42:33.463828 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 03:42:33 crc kubenswrapper[4707]: I0129 03:42:33.464197 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 03:42:33 crc kubenswrapper[4707]: I0129 03:42:33.555322 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cqb8x"]
Jan 29 03:42:33 crc kubenswrapper[4707]: I0129 03:42:33.914416 4707 generic.go:334] "Generic (PLEG): container finished" podID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerID="694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45" exitCode=0
Jan 29 03:42:33 crc kubenswrapper[4707]: I0129 03:42:33.914499 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqb8x" event={"ID":"3129e577-ba51-4ba9-87e6-7f3bc57cef20","Type":"ContainerDied","Data":"694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45"}
Jan 29 03:42:33 crc kubenswrapper[4707]: I0129 03:42:33.914572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqb8x" event={"ID":"3129e577-ba51-4ba9-87e6-7f3bc57cef20","Type":"ContainerStarted","Data":"68130421edb84bf120ea848b4c0f5413c68762545d4e1e3287e0a249adc62e25"}
Jan 29 03:42:34 crc kubenswrapper[4707]: I0129 03:42:34.923206 4707 generic.go:334] "Generic (PLEG): container finished" podID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerID="365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739" exitCode=0
Jan 29 03:42:34 crc kubenswrapper[4707]: I0129 03:42:34.923283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqb8x" event={"ID":"3129e577-ba51-4ba9-87e6-7f3bc57cef20","Type":"ContainerDied","Data":"365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739"}
Jan 29 03:42:35 crc kubenswrapper[4707]: I0129 03:42:35.933495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqb8x" event={"ID":"3129e577-ba51-4ba9-87e6-7f3bc57cef20","Type":"ContainerStarted","Data":"b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64"}
Jan 29 03:42:35 crc kubenswrapper[4707]: I0129 03:42:35.959834 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cqb8x" podStartSLOduration=2.556917463 podStartE2EDuration="3.959804201s" podCreationTimestamp="2026-01-29 03:42:32 +0000 UTC" firstStartedPulling="2026-01-29 03:42:33.917429191 +0000 UTC m=+907.401658096" lastFinishedPulling="2026-01-29 03:42:35.320315929 +0000 UTC m=+908.804544834" observedRunningTime="2026-01-29 03:42:35.953463673 +0000 UTC m=+909.437692578" watchObservedRunningTime="2026-01-29 03:42:35.959804201 +0000 UTC m=+909.444033106"
Jan 29 03:42:43 crc kubenswrapper[4707]: I0129 03:42:43.046653 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:43 crc kubenswrapper[4707]: I0129 03:42:43.047430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cqb8x"
Jan 29 03:42:43 crc kubenswrapper[4707]: I0129 03:42:43.088112 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-marketplace/community-operators-cqb8x" Jan 29 03:42:44 crc kubenswrapper[4707]: I0129 03:42:44.026697 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cqb8x" Jan 29 03:42:44 crc kubenswrapper[4707]: I0129 03:42:44.071470 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cqb8x"] Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.001314 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cqb8x" podUID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerName="registry-server" containerID="cri-o://b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64" gracePeriod=2 Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.367328 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cqb8x" Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.515427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-catalog-content\") pod \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.515563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-utilities\") pod \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.515627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwpwq\" (UniqueName: \"kubernetes.io/projected/3129e577-ba51-4ba9-87e6-7f3bc57cef20-kube-api-access-rwpwq\") pod 
\"3129e577-ba51-4ba9-87e6-7f3bc57cef20\" (UID: \"3129e577-ba51-4ba9-87e6-7f3bc57cef20\") " Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.519241 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-utilities" (OuterVolumeSpecName: "utilities") pod "3129e577-ba51-4ba9-87e6-7f3bc57cef20" (UID: "3129e577-ba51-4ba9-87e6-7f3bc57cef20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.526591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3129e577-ba51-4ba9-87e6-7f3bc57cef20-kube-api-access-rwpwq" (OuterVolumeSpecName: "kube-api-access-rwpwq") pod "3129e577-ba51-4ba9-87e6-7f3bc57cef20" (UID: "3129e577-ba51-4ba9-87e6-7f3bc57cef20"). InnerVolumeSpecName "kube-api-access-rwpwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.567014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3129e577-ba51-4ba9-87e6-7f3bc57cef20" (UID: "3129e577-ba51-4ba9-87e6-7f3bc57cef20"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.617689 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwpwq\" (UniqueName: \"kubernetes.io/projected/3129e577-ba51-4ba9-87e6-7f3bc57cef20-kube-api-access-rwpwq\") on node \"crc\" DevicePath \"\"" Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.617732 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:42:46 crc kubenswrapper[4707]: I0129 03:42:46.617746 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3129e577-ba51-4ba9-87e6-7f3bc57cef20-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.010967 4707 generic.go:334] "Generic (PLEG): container finished" podID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerID="b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64" exitCode=0 Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.011040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqb8x" event={"ID":"3129e577-ba51-4ba9-87e6-7f3bc57cef20","Type":"ContainerDied","Data":"b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64"} Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.011081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqb8x" event={"ID":"3129e577-ba51-4ba9-87e6-7f3bc57cef20","Type":"ContainerDied","Data":"68130421edb84bf120ea848b4c0f5413c68762545d4e1e3287e0a249adc62e25"} Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.011107 4707 scope.go:117] "RemoveContainer" containerID="b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 
03:42:47.011385 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cqb8x" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.030421 4707 scope.go:117] "RemoveContainer" containerID="365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.054193 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cqb8x"] Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.061961 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cqb8x"] Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.072506 4707 scope.go:117] "RemoveContainer" containerID="694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.089347 4707 scope.go:117] "RemoveContainer" containerID="b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64" Jan 29 03:42:47 crc kubenswrapper[4707]: E0129 03:42:47.090131 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64\": container with ID starting with b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64 not found: ID does not exist" containerID="b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.090186 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64"} err="failed to get container status \"b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64\": rpc error: code = NotFound desc = could not find container \"b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64\": container with ID starting with 
b98e1a766c02d961ca0fe6ebb453626d9868a3652ba7f89206c4d12ad70f7c64 not found: ID does not exist" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.090223 4707 scope.go:117] "RemoveContainer" containerID="365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739" Jan 29 03:42:47 crc kubenswrapper[4707]: E0129 03:42:47.090778 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739\": container with ID starting with 365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739 not found: ID does not exist" containerID="365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.090923 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739"} err="failed to get container status \"365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739\": rpc error: code = NotFound desc = could not find container \"365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739\": container with ID starting with 365bb78899ae6e6ac05cad917faec87404d59d5d5ff2ec1a7ae4832afe095739 not found: ID does not exist" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.091036 4707 scope.go:117] "RemoveContainer" containerID="694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45" Jan 29 03:42:47 crc kubenswrapper[4707]: E0129 03:42:47.091872 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45\": container with ID starting with 694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45 not found: ID does not exist" containerID="694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45" Jan 29 03:42:47 crc 
kubenswrapper[4707]: I0129 03:42:47.091938 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45"} err="failed to get container status \"694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45\": rpc error: code = NotFound desc = could not find container \"694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45\": container with ID starting with 694dcb2c78e0197a3ad3a02decb172fa852a50891db5b3509d00f0c17a7c2d45 not found: ID does not exist" Jan 29 03:42:47 crc kubenswrapper[4707]: I0129 03:42:47.251943 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" path="/var/lib/kubelet/pods/3129e577-ba51-4ba9-87e6-7f3bc57cef20/volumes" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.074691 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh"] Jan 29 03:42:54 crc kubenswrapper[4707]: E0129 03:42:54.075718 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerName="extract-content" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.075733 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerName="extract-content" Jan 29 03:42:54 crc kubenswrapper[4707]: E0129 03:42:54.075748 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerName="registry-server" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.075755 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerName="registry-server" Jan 29 03:42:54 crc kubenswrapper[4707]: E0129 03:42:54.075763 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" 
containerName="extract-utilities" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.075771 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerName="extract-utilities" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.075894 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3129e577-ba51-4ba9-87e6-7f3bc57cef20" containerName="registry-server" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.076399 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.081072 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-z5qpk" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.084317 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.085593 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.088682 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-tjcw5" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.097465 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.098860 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.104222 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.111450 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.113451 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-jzktf" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.121188 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.127697 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.128994 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.134673 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-x979b" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.138894 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.139865 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.169034 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-h6nn2" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.229424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f9tr\" (UniqueName: \"kubernetes.io/projected/155a3715-4600-4f83-8db3-a6beaf5c3394-kube-api-access-9f9tr\") pod \"cinder-operator-controller-manager-f6487bd57-qbdg4\" (UID: \"155a3715-4600-4f83-8db3-a6beaf5c3394\") " pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.229519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n957p\" (UniqueName: \"kubernetes.io/projected/3445268f-15c8-4438-8fd1-a13d2bd9981d-kube-api-access-n957p\") pod \"glance-operator-controller-manager-6db5dbd896-c2sgx\" (UID: \"3445268f-15c8-4438-8fd1-a13d2bd9981d\") " pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.229564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7zw6\" (UniqueName: \"kubernetes.io/projected/6dbe27ba-a451-4202-8f58-73cb0684bfea-kube-api-access-f7zw6\") pod \"barbican-operator-controller-manager-6bc7f4f4cf-9xlqh\" (UID: \"6dbe27ba-a451-4202-8f58-73cb0684bfea\") " pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.229588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpk49\" (UniqueName: 
\"kubernetes.io/projected/f064b8fa-dd53-4fd8-8440-9e517b1c1279-kube-api-access-rpk49\") pod \"designate-operator-controller-manager-66dfbd6f5d-jlsmf\" (UID: \"f064b8fa-dd53-4fd8-8440-9e517b1c1279\") " pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.235930 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.237331 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.252625 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-4wck9"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.254140 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.255867 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-kn7sz" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.259863 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jqvgs" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.260020 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.277630 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.317072 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.331128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f9tr\" (UniqueName: \"kubernetes.io/projected/155a3715-4600-4f83-8db3-a6beaf5c3394-kube-api-access-9f9tr\") pod \"cinder-operator-controller-manager-f6487bd57-qbdg4\" (UID: \"155a3715-4600-4f83-8db3-a6beaf5c3394\") " pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.331211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbncl\" (UniqueName: \"kubernetes.io/projected/de21d951-1d0b-415e-8923-5fa2cc58e439-kube-api-access-hbncl\") pod \"heat-operator-controller-manager-587c6bfdcf-c8v2v\" (UID: \"de21d951-1d0b-415e-8923-5fa2cc58e439\") " 
pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.331254 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n957p\" (UniqueName: \"kubernetes.io/projected/3445268f-15c8-4438-8fd1-a13d2bd9981d-kube-api-access-n957p\") pod \"glance-operator-controller-manager-6db5dbd896-c2sgx\" (UID: \"3445268f-15c8-4438-8fd1-a13d2bd9981d\") " pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.331280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7zw6\" (UniqueName: \"kubernetes.io/projected/6dbe27ba-a451-4202-8f58-73cb0684bfea-kube-api-access-f7zw6\") pod \"barbican-operator-controller-manager-6bc7f4f4cf-9xlqh\" (UID: \"6dbe27ba-a451-4202-8f58-73cb0684bfea\") " pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.331304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpk49\" (UniqueName: \"kubernetes.io/projected/f064b8fa-dd53-4fd8-8440-9e517b1c1279-kube-api-access-rpk49\") pod \"designate-operator-controller-manager-66dfbd6f5d-jlsmf\" (UID: \"f064b8fa-dd53-4fd8-8440-9e517b1c1279\") " pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.388634 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-958664b5-qgj46"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.390317 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.397717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f9tr\" (UniqueName: \"kubernetes.io/projected/155a3715-4600-4f83-8db3-a6beaf5c3394-kube-api-access-9f9tr\") pod \"cinder-operator-controller-manager-f6487bd57-qbdg4\" (UID: \"155a3715-4600-4f83-8db3-a6beaf5c3394\") " pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.398658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7zw6\" (UniqueName: \"kubernetes.io/projected/6dbe27ba-a451-4202-8f58-73cb0684bfea-kube-api-access-f7zw6\") pod \"barbican-operator-controller-manager-6bc7f4f4cf-9xlqh\" (UID: \"6dbe27ba-a451-4202-8f58-73cb0684bfea\") " pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.402335 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpk49\" (UniqueName: \"kubernetes.io/projected/f064b8fa-dd53-4fd8-8440-9e517b1c1279-kube-api-access-rpk49\") pod \"designate-operator-controller-manager-66dfbd6f5d-jlsmf\" (UID: \"f064b8fa-dd53-4fd8-8440-9e517b1c1279\") " pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.403345 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-bcckn" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.408478 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.418927 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.430768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n957p\" (UniqueName: \"kubernetes.io/projected/3445268f-15c8-4438-8fd1-a13d2bd9981d-kube-api-access-n957p\") pod \"glance-operator-controller-manager-6db5dbd896-c2sgx\" (UID: \"3445268f-15c8-4438-8fd1-a13d2bd9981d\") " pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.450394 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.451772 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.452963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78gzw\" (UniqueName: \"kubernetes.io/projected/09633ead-78c6-4934-95c2-05b24c6fc3e5-kube-api-access-78gzw\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.453123 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc8s2\" (UniqueName: \"kubernetes.io/projected/59a9dc92-c9db-4bfa-8233-88b1690beaad-kube-api-access-qc8s2\") pod \"horizon-operator-controller-manager-5fb775575f-z2vvp\" (UID: \"59a9dc92-c9db-4bfa-8233-88b1690beaad\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.453206 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.453293 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbncl\" (UniqueName: \"kubernetes.io/projected/de21d951-1d0b-415e-8923-5fa2cc58e439-kube-api-access-hbncl\") pod \"heat-operator-controller-manager-587c6bfdcf-c8v2v\" (UID: \"de21d951-1d0b-415e-8923-5fa2cc58e439\") " pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.473510 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.480008 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-5hnjb" Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.493957 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-4wck9"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.512720 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.516386 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-958664b5-qgj46"] Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.524893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hbncl\" (UniqueName: \"kubernetes.io/projected/de21d951-1d0b-415e-8923-5fa2cc58e439-kube-api-access-hbncl\") pod \"heat-operator-controller-manager-587c6bfdcf-c8v2v\" (UID: \"de21d951-1d0b-415e-8923-5fa2cc58e439\") " pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.531550 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-765668569f-jq2z4"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.534160 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.556961 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-ftc7l"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.558075 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6pv\" (UniqueName: \"kubernetes.io/projected/8255e85b-8815-4860-9325-7570ba9a6fd9-kube-api-access-tl6pv\") pod \"ironic-operator-controller-manager-958664b5-qgj46\" (UID: \"8255e85b-8815-4860-9325-7570ba9a6fd9\") " pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.558141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc8s2\" (UniqueName: \"kubernetes.io/projected/59a9dc92-c9db-4bfa-8233-88b1690beaad-kube-api-access-qc8s2\") pod \"horizon-operator-controller-manager-5fb775575f-z2vvp\" (UID: \"59a9dc92-c9db-4bfa-8233-88b1690beaad\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.558167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.558200 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78gzw\" (UniqueName: \"kubernetes.io/projected/09633ead-78c6-4934-95c2-05b24c6fc3e5-kube-api-access-78gzw\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9"
Jan 29 03:42:54 crc kubenswrapper[4707]: E0129 03:42:54.558478 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 29 03:42:54 crc kubenswrapper[4707]: E0129 03:42:54.558524 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert podName:09633ead-78c6-4934-95c2-05b24c6fc3e5 nodeName:}" failed. No retries permitted until 2026-01-29 03:42:55.058507182 +0000 UTC m=+928.542736087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert") pod "infra-operator-controller-manager-79955696d6-4wck9" (UID: "09633ead-78c6-4934-95c2-05b24c6fc3e5") : secret "infra-operator-webhook-server-cert" not found
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.567177 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.575936 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-765668569f-jq2z4"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.582734 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.583781 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.591039 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5j7wn"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.605500 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.607164 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.615088 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-225zs"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.615566 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.616439 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.617762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc8s2\" (UniqueName: \"kubernetes.io/projected/59a9dc92-c9db-4bfa-8233-88b1690beaad-kube-api-access-qc8s2\") pod \"horizon-operator-controller-manager-5fb775575f-z2vvp\" (UID: \"59a9dc92-c9db-4bfa-8233-88b1690beaad\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.622620 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.644613 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.644705 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mk4dk"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.644717 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.655266 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78gzw\" (UniqueName: \"kubernetes.io/projected/09633ead-78c6-4934-95c2-05b24c6fc3e5-kube-api-access-78gzw\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.662886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6pv\" (UniqueName: \"kubernetes.io/projected/8255e85b-8815-4860-9325-7570ba9a6fd9-kube-api-access-tl6pv\") pod \"ironic-operator-controller-manager-958664b5-qgj46\" (UID: \"8255e85b-8815-4860-9325-7570ba9a6fd9\") " pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.662960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvqw\" (UniqueName: \"kubernetes.io/projected/02f283f2-5bf1-4ee7-ac34-751ffc96421c-kube-api-access-dcvqw\") pod \"manila-operator-controller-manager-765668569f-jq2z4\" (UID: \"02f283f2-5bf1-4ee7-ac34-751ffc96421c\") " pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.663001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9b6z\" (UniqueName: \"kubernetes.io/projected/6fa97c9f-4b04-4795-9f11-9790c692ba0f-kube-api-access-b9b6z\") pod \"keystone-operator-controller-manager-6978b79747-49nhf\" (UID: \"6fa97c9f-4b04-4795-9f11-9790c692ba0f\") " pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.663039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2m2p\" (UniqueName: \"kubernetes.io/projected/55320b88-8f86-47bd-8718-6cabd0865a1c-kube-api-access-p2m2p\") pod \"mariadb-operator-controller-manager-67bf948998-jvdtd\" (UID: \"55320b88-8f86-47bd-8718-6cabd0865a1c\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.677982 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.678828 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.696576 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.698157 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.705030 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.705940 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wbb8c"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.705999 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-hr847"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.706216 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.714611 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.715870 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6pv\" (UniqueName: \"kubernetes.io/projected/8255e85b-8815-4860-9325-7570ba9a6fd9-kube-api-access-tl6pv\") pod \"ironic-operator-controller-manager-958664b5-qgj46\" (UID: \"8255e85b-8815-4860-9325-7570ba9a6fd9\") " pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.721271 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.729919 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.736187 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vzdbv"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.742865 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.744103 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.749601 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.755093 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-d6v48"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.755299 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.762689 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.764463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvqw\" (UniqueName: \"kubernetes.io/projected/02f283f2-5bf1-4ee7-ac34-751ffc96421c-kube-api-access-dcvqw\") pod \"manila-operator-controller-manager-765668569f-jq2z4\" (UID: \"02f283f2-5bf1-4ee7-ac34-751ffc96421c\") " pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.764496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9b6z\" (UniqueName: \"kubernetes.io/projected/6fa97c9f-4b04-4795-9f11-9790c692ba0f-kube-api-access-b9b6z\") pod \"keystone-operator-controller-manager-6978b79747-49nhf\" (UID: \"6fa97c9f-4b04-4795-9f11-9790c692ba0f\") " pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.765044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.769184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2m2p\" (UniqueName: \"kubernetes.io/projected/55320b88-8f86-47bd-8718-6cabd0865a1c-kube-api-access-p2m2p\") pod \"mariadb-operator-controller-manager-67bf948998-jvdtd\" (UID: \"55320b88-8f86-47bd-8718-6cabd0865a1c\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.769261 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmpt\" (UniqueName: \"kubernetes.io/projected/7d2c1f08-0b63-4368-a7cc-9374d0dbf035-kube-api-access-9wmpt\") pod \"neutron-operator-controller-manager-694c5bfc85-g6dzc\" (UID: \"7d2c1f08-0b63-4368-a7cc-9374d0dbf035\") " pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.769283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ffs\" (UniqueName: \"kubernetes.io/projected/e15cd320-f902-4d99-8037-5c9355f4a833-kube-api-access-28ffs\") pod \"nova-operator-controller-manager-ddcbfd695-mrfw2\" (UID: \"e15cd320-f902-4d99-8037-5c9355f4a833\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.776048 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-k4sbj"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.796283 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.810772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvqw\" (UniqueName: \"kubernetes.io/projected/02f283f2-5bf1-4ee7-ac34-751ffc96421c-kube-api-access-dcvqw\") pod \"manila-operator-controller-manager-765668569f-jq2z4\" (UID: \"02f283f2-5bf1-4ee7-ac34-751ffc96421c\") " pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.830758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.846287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9b6z\" (UniqueName: \"kubernetes.io/projected/6fa97c9f-4b04-4795-9f11-9790c692ba0f-kube-api-access-b9b6z\") pod \"keystone-operator-controller-manager-6978b79747-49nhf\" (UID: \"6fa97c9f-4b04-4795-9f11-9790c692ba0f\") " pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.848805 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2m2p\" (UniqueName: \"kubernetes.io/projected/55320b88-8f86-47bd-8718-6cabd0865a1c-kube-api-access-p2m2p\") pod \"mariadb-operator-controller-manager-67bf948998-jvdtd\" (UID: \"55320b88-8f86-47bd-8718-6cabd0865a1c\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.870327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmpt\" (UniqueName: \"kubernetes.io/projected/7d2c1f08-0b63-4368-a7cc-9374d0dbf035-kube-api-access-9wmpt\") pod \"neutron-operator-controller-manager-694c5bfc85-g6dzc\" (UID: \"7d2c1f08-0b63-4368-a7cc-9374d0dbf035\") " pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.870884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ffs\" (UniqueName: \"kubernetes.io/projected/e15cd320-f902-4d99-8037-5c9355f4a833-kube-api-access-28ffs\") pod \"nova-operator-controller-manager-ddcbfd695-mrfw2\" (UID: \"e15cd320-f902-4d99-8037-5c9355f4a833\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.870974 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.871050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nsx9\" (UniqueName: \"kubernetes.io/projected/e0372a1a-cd84-491e-a3d1-f58389a66b63-kube-api-access-7nsx9\") pod \"ovn-operator-controller-manager-788c46999f-4cp2t\" (UID: \"e0372a1a-cd84-491e-a3d1-f58389a66b63\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.871141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn2tb\" (UniqueName: \"kubernetes.io/projected/f57d529d-1352-47d9-baa8-a2f383374b35-kube-api-access-nn2tb\") pod \"placement-operator-controller-manager-5b964cf4cd-wblrt\" (UID: \"f57d529d-1352-47d9-baa8-a2f383374b35\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.871229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpsmt\" (UniqueName: \"kubernetes.io/projected/a6df0676-63de-4a83-bc60-9b69a2f8777f-kube-api-access-qpsmt\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.871303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8w2c\" (UniqueName: \"kubernetes.io/projected/26d2ace7-4405-480c-acf8-233e1511007f-kube-api-access-t8w2c\") pod \"octavia-operator-controller-manager-5c765b4558-5jtr8\" (UID: \"26d2ace7-4405-480c-acf8-233e1511007f\") " pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.871433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlb4\" (UniqueName: \"kubernetes.io/projected/b7b5c12b-680b-4814-906c-62c9f8702559-kube-api-access-6qlb4\") pod \"swift-operator-controller-manager-68fc8c869-zmvww\" (UID: \"b7b5c12b-680b-4814-906c-62c9f8702559\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.876680 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.877343 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.882244 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.888727 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.889847 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.895427 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-27zxq"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.896965 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.923450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmpt\" (UniqueName: \"kubernetes.io/projected/7d2c1f08-0b63-4368-a7cc-9374d0dbf035-kube-api-access-9wmpt\") pod \"neutron-operator-controller-manager-694c5bfc85-g6dzc\" (UID: \"7d2c1f08-0b63-4368-a7cc-9374d0dbf035\") " pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.926983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq"]
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.927169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ffs\" (UniqueName: \"kubernetes.io/projected/e15cd320-f902-4d99-8037-5c9355f4a833-kube-api-access-28ffs\") pod \"nova-operator-controller-manager-ddcbfd695-mrfw2\" (UID: \"e15cd320-f902-4d99-8037-5c9355f4a833\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.965208 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.973934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nsx9\" (UniqueName: \"kubernetes.io/projected/e0372a1a-cd84-491e-a3d1-f58389a66b63-kube-api-access-7nsx9\") pod \"ovn-operator-controller-manager-788c46999f-4cp2t\" (UID: \"e0372a1a-cd84-491e-a3d1-f58389a66b63\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.973985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn2tb\" (UniqueName: \"kubernetes.io/projected/f57d529d-1352-47d9-baa8-a2f383374b35-kube-api-access-nn2tb\") pod \"placement-operator-controller-manager-5b964cf4cd-wblrt\" (UID: \"f57d529d-1352-47d9-baa8-a2f383374b35\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.974016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpsmt\" (UniqueName: \"kubernetes.io/projected/a6df0676-63de-4a83-bc60-9b69a2f8777f-kube-api-access-qpsmt\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.974041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8w2c\" (UniqueName: \"kubernetes.io/projected/26d2ace7-4405-480c-acf8-233e1511007f-kube-api-access-t8w2c\") pod \"octavia-operator-controller-manager-5c765b4558-5jtr8\" (UID: \"26d2ace7-4405-480c-acf8-233e1511007f\") " pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.974097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcd8h\" (UniqueName: \"kubernetes.io/projected/0a32b73c-f66f-425f-81a9-ef1cc36041d4-kube-api-access-fcd8h\") pod \"telemetry-operator-controller-manager-7886d5cc69-w8rzq\" (UID: \"0a32b73c-f66f-425f-81a9-ef1cc36041d4\") " pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.974155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlb4\" (UniqueName: \"kubernetes.io/projected/b7b5c12b-680b-4814-906c-62c9f8702559-kube-api-access-6qlb4\") pod \"swift-operator-controller-manager-68fc8c869-zmvww\" (UID: \"b7b5c12b-680b-4814-906c-62c9f8702559\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww"
Jan 29 03:42:54 crc kubenswrapper[4707]: I0129 03:42:54.974206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv"
Jan 29 03:42:54 crc kubenswrapper[4707]: E0129 03:42:54.974399 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 29 03:42:54 crc kubenswrapper[4707]: E0129 03:42:54.974469 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert podName:a6df0676-63de-4a83-bc60-9b69a2f8777f nodeName:}" failed. No retries permitted until 2026-01-29 03:42:55.47444772 +0000 UTC m=+928.958676625 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" (UID: "a6df0676-63de-4a83-bc60-9b69a2f8777f") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.001365 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj"]
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.002522 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.005865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn2tb\" (UniqueName: \"kubernetes.io/projected/f57d529d-1352-47d9-baa8-a2f383374b35-kube-api-access-nn2tb\") pod \"placement-operator-controller-manager-5b964cf4cd-wblrt\" (UID: \"f57d529d-1352-47d9-baa8-a2f383374b35\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.006304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nsx9\" (UniqueName: \"kubernetes.io/projected/e0372a1a-cd84-491e-a3d1-f58389a66b63-kube-api-access-7nsx9\") pod \"ovn-operator-controller-manager-788c46999f-4cp2t\" (UID: \"e0372a1a-cd84-491e-a3d1-f58389a66b63\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.013670 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-5strm"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.067225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlb4\" (UniqueName: \"kubernetes.io/projected/b7b5c12b-680b-4814-906c-62c9f8702559-kube-api-access-6qlb4\") pod \"swift-operator-controller-manager-68fc8c869-zmvww\" (UID: \"b7b5c12b-680b-4814-906c-62c9f8702559\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.067965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8w2c\" (UniqueName: \"kubernetes.io/projected/26d2ace7-4405-480c-acf8-233e1511007f-kube-api-access-t8w2c\") pod \"octavia-operator-controller-manager-5c765b4558-5jtr8\" (UID: \"26d2ace7-4405-480c-acf8-233e1511007f\") " pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.074311 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.082934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhx4c\" (UniqueName: \"kubernetes.io/projected/6e5e159a-c89d-43cf-b9cf-4a92de09ac22-kube-api-access-lhx4c\") pod \"test-operator-controller-manager-56f8bfcd9f-lnfzj\" (UID: \"6e5e159a-c89d-43cf-b9cf-4a92de09ac22\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.083081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcd8h\" (UniqueName: \"kubernetes.io/projected/0a32b73c-f66f-425f-81a9-ef1cc36041d4-kube-api-access-fcd8h\") pod \"telemetry-operator-controller-manager-7886d5cc69-w8rzq\" (UID: \"0a32b73c-f66f-425f-81a9-ef1cc36041d4\") " pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.083235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9"
Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.083435 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.083513 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert podName:09633ead-78c6-4934-95c2-05b24c6fc3e5 nodeName:}" failed. No retries permitted until 2026-01-29 03:42:56.083490536 +0000 UTC m=+929.567719431 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert") pod "infra-operator-controller-manager-79955696d6-4wck9" (UID: "09633ead-78c6-4934-95c2-05b24c6fc3e5") : secret "infra-operator-webhook-server-cert" not found
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.084942 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj"]
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.086746 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.101497 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.126753 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2"]
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.129573 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.135921 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpsmt\" (UniqueName: \"kubernetes.io/projected/a6df0676-63de-4a83-bc60-9b69a2f8777f-kube-api-access-qpsmt\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.137516 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-w2rc8"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.143629 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2"]
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.144798 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcd8h\" (UniqueName: \"kubernetes.io/projected/0a32b73c-f66f-425f-81a9-ef1cc36041d4-kube-api-access-fcd8h\") pod \"telemetry-operator-controller-manager-7886d5cc69-w8rzq\" (UID: \"0a32b73c-f66f-425f-81a9-ef1cc36041d4\") " pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.146206 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.162614 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.163125 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.185034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlrn6\" (UniqueName: \"kubernetes.io/projected/706ea7e5-d8b2-4bc1-900b-d62dddcad89e-kube-api-access-wlrn6\") pod \"watcher-operator-controller-manager-767b8bc766-mhnm2\" (UID: \"706ea7e5-d8b2-4bc1-900b-d62dddcad89e\") " pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.185480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhx4c\" (UniqueName: \"kubernetes.io/projected/6e5e159a-c89d-43cf-b9cf-4a92de09ac22-kube-api-access-lhx4c\") pod \"test-operator-controller-manager-56f8bfcd9f-lnfzj\" (UID: \"6e5e159a-c89d-43cf-b9cf-4a92de09ac22\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj"
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.207909 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn"]
Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.209172 4707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.214228 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.215137 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-s45xq" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.231407 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.235324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhx4c\" (UniqueName: \"kubernetes.io/projected/6e5e159a-c89d-43cf-b9cf-4a92de09ac22-kube-api-access-lhx4c\") pod \"test-operator-controller-manager-56f8bfcd9f-lnfzj\" (UID: \"6e5e159a-c89d-43cf-b9cf-4a92de09ac22\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.270123 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.286364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzfrx\" (UniqueName: \"kubernetes.io/projected/d938abde-b4d6-4d4e-a176-9ed92ac5325d-kube-api-access-fzfrx\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.286723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlrn6\" (UniqueName: \"kubernetes.io/projected/706ea7e5-d8b2-4bc1-900b-d62dddcad89e-kube-api-access-wlrn6\") pod \"watcher-operator-controller-manager-767b8bc766-mhnm2\" (UID: \"706ea7e5-d8b2-4bc1-900b-d62dddcad89e\") " pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.286915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.287067 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc 
kubenswrapper[4707]: I0129 03:42:55.289506 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn"] Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.305821 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.328517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlrn6\" (UniqueName: \"kubernetes.io/projected/706ea7e5-d8b2-4bc1-900b-d62dddcad89e-kube-api-access-wlrn6\") pod \"watcher-operator-controller-manager-767b8bc766-mhnm2\" (UID: \"706ea7e5-d8b2-4bc1-900b-d62dddcad89e\") " pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.348071 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj"] Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.349292 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.354213 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-874wf" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.393847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.393915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.393968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzfrx\" (UniqueName: \"kubernetes.io/projected/d938abde-b4d6-4d4e-a176-9ed92ac5325d-kube-api-access-fzfrx\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.394213 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.394318 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:42:55.894284257 +0000 UTC m=+929.378513162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "webhook-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.394505 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.394582 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:42:55.894561605 +0000 UTC m=+929.378790510 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "metrics-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.398027 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj"] Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.424613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzfrx\" (UniqueName: \"kubernetes.io/projected/d938abde-b4d6-4d4e-a176-9ed92ac5325d-kube-api-access-fzfrx\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc kubenswrapper[4707]: W0129 03:42:55.433934 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3445268f_15c8_4438_8fd1_a13d2bd9981d.slice/crio-b0c1bb4a315d74fe011e2b03daf6eb6597d928abcd37601148af1d35d2112ca3 WatchSource:0}: Error finding container b0c1bb4a315d74fe011e2b03daf6eb6597d928abcd37601148af1d35d2112ca3: Status 404 returned error can't find the container with id b0c1bb4a315d74fe011e2b03daf6eb6597d928abcd37601148af1d35d2112ca3 Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.435384 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf"] Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.451831 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx"] Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.486547 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.496651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.496802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lb6z\" (UniqueName: \"kubernetes.io/projected/6fb86866-7c9d-4b4f-bf81-8a36898aca3d-kube-api-access-9lb6z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b6ggj\" (UID: \"6fb86866-7c9d-4b4f-bf81-8a36898aca3d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.497037 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.497082 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert podName:a6df0676-63de-4a83-bc60-9b69a2f8777f nodeName:}" failed. No retries permitted until 2026-01-29 03:42:56.497066518 +0000 UTC m=+929.981295413 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" (UID: "a6df0676-63de-4a83-bc60-9b69a2f8777f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.533755 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.598207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lb6z\" (UniqueName: \"kubernetes.io/projected/6fb86866-7c9d-4b4f-bf81-8a36898aca3d-kube-api-access-9lb6z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b6ggj\" (UID: \"6fb86866-7c9d-4b4f-bf81-8a36898aca3d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.627439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lb6z\" (UniqueName: \"kubernetes.io/projected/6fb86866-7c9d-4b4f-bf81-8a36898aca3d-kube-api-access-9lb6z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-b6ggj\" (UID: \"6fb86866-7c9d-4b4f-bf81-8a36898aca3d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.644041 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4"] Jan 29 03:42:55 crc kubenswrapper[4707]: W0129 03:42:55.661879 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod155a3715_4600_4f83_8db3_a6beaf5c3394.slice/crio-73172fa7abf7d0ebe24525f163a1e3b53c33f773f4fd39127e0c47148b4a841e WatchSource:0}: Error finding container 
73172fa7abf7d0ebe24525f163a1e3b53c33f773f4fd39127e0c47148b4a841e: Status 404 returned error can't find the container with id 73172fa7abf7d0ebe24525f163a1e3b53c33f773f4fd39127e0c47148b4a841e Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.688183 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh"] Jan 29 03:42:55 crc kubenswrapper[4707]: W0129 03:42:55.782731 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dbe27ba_a451_4202_8f58_73cb0684bfea.slice/crio-cd477867e2a5c8681ac95e9948ebf36b03dc39b5ee7b5995a1eb4d1fb63f8e8b WatchSource:0}: Error finding container cd477867e2a5c8681ac95e9948ebf36b03dc39b5ee7b5995a1eb4d1fb63f8e8b: Status 404 returned error can't find the container with id cd477867e2a5c8681ac95e9948ebf36b03dc39b5ee7b5995a1eb4d1fb63f8e8b Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.901449 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.904684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.904755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.904982 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.905055 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:42:56.905032994 +0000 UTC m=+930.389261899 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "metrics-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.905929 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: E0129 03:42:55.905969 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:42:56.90596006 +0000 UTC m=+930.390188965 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "webhook-server-cert" not found Jan 29 03:42:55 crc kubenswrapper[4707]: I0129 03:42:55.924203 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v"] Jan 29 03:42:55 crc kubenswrapper[4707]: W0129 03:42:55.930850 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde21d951_1d0b_415e_8923_5fa2cc58e439.slice/crio-aebe4109b058be3f752cb9a34c2cbaad1676dad4910c1d195eb73019bfce3850 WatchSource:0}: Error finding container aebe4109b058be3f752cb9a34c2cbaad1676dad4910c1d195eb73019bfce3850: Status 404 returned error can't find the container with id aebe4109b058be3f752cb9a34c2cbaad1676dad4910c1d195eb73019bfce3850 Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.093638 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-765668569f-jq2z4"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.107017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.107274 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.107357 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert podName:09633ead-78c6-4934-95c2-05b24c6fc3e5 nodeName:}" failed. No retries permitted until 2026-01-29 03:42:58.107334885 +0000 UTC m=+931.591563790 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert") pod "infra-operator-controller-manager-79955696d6-4wck9" (UID: "09633ead-78c6-4934-95c2-05b24c6fc3e5") : secret "infra-operator-webhook-server-cert" not found Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.108377 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.141419 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-958664b5-qgj46"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.141473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh" event={"ID":"6dbe27ba-a451-4202-8f58-73cb0684bfea","Type":"ContainerStarted","Data":"cd477867e2a5c8681ac95e9948ebf36b03dc39b5ee7b5995a1eb4d1fb63f8e8b"} Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.144417 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.155504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" event={"ID":"de21d951-1d0b-415e-8923-5fa2cc58e439","Type":"ContainerStarted","Data":"aebe4109b058be3f752cb9a34c2cbaad1676dad4910c1d195eb73019bfce3850"} Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.157497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" event={"ID":"3445268f-15c8-4438-8fd1-a13d2bd9981d","Type":"ContainerStarted","Data":"b0c1bb4a315d74fe011e2b03daf6eb6597d928abcd37601148af1d35d2112ca3"} Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.164555 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" event={"ID":"155a3715-4600-4f83-8db3-a6beaf5c3394","Type":"ContainerStarted","Data":"73172fa7abf7d0ebe24525f163a1e3b53c33f773f4fd39127e0c47148b4a841e"} Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.168615 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" event={"ID":"f064b8fa-dd53-4fd8-8440-9e517b1c1279","Type":"ContainerStarted","Data":"f2807888998c88a5426944f3dc1bf62a6ec6cdde4b81006570794ae90905e8b9"} Jan 29 03:42:56 crc kubenswrapper[4707]: W0129 03:42:56.169977 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8255e85b_8815_4860_9325_7570ba9a6fd9.slice/crio-7de92e9c3e529d512984e3bdcbdfe91da86e5263544586340704cc19f24cda15 WatchSource:0}: Error finding container 7de92e9c3e529d512984e3bdcbdfe91da86e5263544586340704cc19f24cda15: Status 404 returned error can't find the container with id 7de92e9c3e529d512984e3bdcbdfe91da86e5263544586340704cc19f24cda15 Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.236065 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww"] Jan 29 03:42:56 crc kubenswrapper[4707]: W0129 03:42:56.239027 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a32b73c_f66f_425f_81a9_ef1cc36041d4.slice/crio-4ae9102310200b4f4a73314d8b38399354b67dc7484a52e249a542440cbca4a6 WatchSource:0}: Error finding container 4ae9102310200b4f4a73314d8b38399354b67dc7484a52e249a542440cbca4a6: Status 404 returned error can't find the container with id 4ae9102310200b4f4a73314d8b38399354b67dc7484a52e249a542440cbca4a6 Jan 29 03:42:56 crc kubenswrapper[4707]: W0129 03:42:56.251362 4707 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d2c1f08_0b63_4368_a7cc_9374d0dbf035.slice/crio-ff48713b9a911b82b46ade401bdcb3c3581ac8629f2322043279ea078eeaf5b6 WatchSource:0}: Error finding container ff48713b9a911b82b46ade401bdcb3c3581ac8629f2322043279ea078eeaf5b6: Status 404 returned error can't find the container with id ff48713b9a911b82b46ade401bdcb3c3581ac8629f2322043279ea078eeaf5b6 Jan 29 03:42:56 crc kubenswrapper[4707]: W0129 03:42:56.260329 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode15cd320_f902_4d99_8037_5c9355f4a833.slice/crio-eb528fc99c5826c5b02bb87498def44285d6410413ce213362f69ee47b8f7eea WatchSource:0}: Error finding container eb528fc99c5826c5b02bb87498def44285d6410413ce213362f69ee47b8f7eea: Status 404 returned error can't find the container with id eb528fc99c5826c5b02bb87498def44285d6410413ce213362f69ee47b8f7eea Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.269330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc"] Jan 29 03:42:56 crc kubenswrapper[4707]: W0129 03:42:56.288027 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7b5c12b_680b_4814_906c_62c9f8702559.slice/crio-5c0df191f7913515c8c2e6d58fd1b22472416a3ffc56fa937898317ea4ac8208 WatchSource:0}: Error finding container 5c0df191f7913515c8c2e6d58fd1b22472416a3ffc56fa937898317ea4ac8208: Status 404 returned error can't find the container with id 5c0df191f7913515c8c2e6d58fd1b22472416a3ffc56fa937898317ea4ac8208 Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.300841 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.315079 4707 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.322982 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.330652 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.407763 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.410675 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.433066 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t"] Jan 29 03:42:56 crc kubenswrapper[4707]: W0129 03:42:56.452612 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706ea7e5_d8b2_4bc1_900b_d62dddcad89e.slice/crio-320784b43f3c0c7fc455e0404916a3e66c80c1ce7386eafbe0e6ad71d12e590e WatchSource:0}: Error finding container 320784b43f3c0c7fc455e0404916a3e66c80c1ce7386eafbe0e6ad71d12e590e: Status 404 returned error can't find the container with id 320784b43f3c0c7fc455e0404916a3e66c80c1ce7386eafbe0e6ad71d12e590e Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.459115 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/watcher-operator@sha256:35f1eb96f42069bb8f7c33942fb86b41843ba02803464245c16192ccda3d50e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wlrn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-767b8bc766-mhnm2_openstack-operators(706ea7e5-d8b2-4bc1-900b-d62dddcad89e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.461115 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" podUID="706ea7e5-d8b2-4bc1-900b-d62dddcad89e" Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.467335 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/octavia-operator@sha256:c7804813a3bba8910a47a5f32bd528335e18397f93cf5f7e7181d3d2c209b59b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t8w2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5c765b4558-5jtr8_openstack-operators(26d2ace7-4405-480c-acf8-233e1511007f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.468513 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7nsx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-788c46999f-4cp2t_openstack-operators(e0372a1a-cd84-491e-a3d1-f58389a66b63): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.468578 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8" podUID="26d2ace7-4405-480c-acf8-233e1511007f" Jan 29 03:42:56 crc 
kubenswrapper[4707]: E0129 03:42:56.469902 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t" podUID="e0372a1a-cd84-491e-a3d1-f58389a66b63" Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.479358 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt"] Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.500719 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj"] Jan 29 03:42:56 crc kubenswrapper[4707]: W0129 03:42:56.518923 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf57d529d_1352_47d9_baa8_a2f383374b35.slice/crio-8bcac68bbe9dd7e88311d8b04b2de74bae9bfbf752a57d16f51e93f909ba8754 WatchSource:0}: Error finding container 8bcac68bbe9dd7e88311d8b04b2de74bae9bfbf752a57d16f51e93f909ba8754: Status 404 returned error can't find the container with id 8bcac68bbe9dd7e88311d8b04b2de74bae9bfbf752a57d16f51e93f909ba8754 Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.526029 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nn2tb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-wblrt_openstack-operators(f57d529d-1352-47d9-baa8-a2f383374b35): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.527980 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt" podUID="f57d529d-1352-47d9-baa8-a2f383374b35" Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.535848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.536040 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not 
found Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.536095 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert podName:a6df0676-63de-4a83-bc60-9b69a2f8777f nodeName:}" failed. No retries permitted until 2026-01-29 03:42:58.536077681 +0000 UTC m=+932.020306586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" (UID: "a6df0676-63de-4a83-bc60-9b69a2f8777f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.952181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:56 crc kubenswrapper[4707]: I0129 03:42:56.958731 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.952438 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.959167 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs 
podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:42:58.959147209 +0000 UTC m=+932.443376104 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "webhook-server-cert" not found Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.959092 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 03:42:56 crc kubenswrapper[4707]: E0129 03:42:56.960241 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:42:58.960227809 +0000 UTC m=+932.444456714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "metrics-server-cert" not found Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.182981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46" event={"ID":"8255e85b-8815-4860-9325-7570ba9a6fd9","Type":"ContainerStarted","Data":"7de92e9c3e529d512984e3bdcbdfe91da86e5263544586340704cc19f24cda15"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.184906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t" event={"ID":"e0372a1a-cd84-491e-a3d1-f58389a66b63","Type":"ContainerStarted","Data":"94cada76aea79000ace9f7c99880abc1b8d8c7dd0227f26f3f9a8e7fae614d12"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.188337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq" event={"ID":"0a32b73c-f66f-425f-81a9-ef1cc36041d4","Type":"ContainerStarted","Data":"4ae9102310200b4f4a73314d8b38399354b67dc7484a52e249a542440cbca4a6"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.190926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" event={"ID":"6fb86866-7c9d-4b4f-bf81-8a36898aca3d","Type":"ContainerStarted","Data":"90eb86d2f929c5f4edc23669f6b07190a5de23408dbba768b01485e293891638"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.192425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp" 
event={"ID":"59a9dc92-c9db-4bfa-8233-88b1690beaad","Type":"ContainerStarted","Data":"6a5e2665233da23150c44a4ffbb9fd956b99b4e9915a3ef9939645d8318ee739"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.196638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8" event={"ID":"26d2ace7-4405-480c-acf8-233e1511007f","Type":"ContainerStarted","Data":"f52c90af8d7b33238436cc72885878242466fbac8a3b2f70418b112809bec843"} Jan 29 03:42:57 crc kubenswrapper[4707]: E0129 03:42:57.197742 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t" podUID="e0372a1a-cd84-491e-a3d1-f58389a66b63" Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.200958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww" event={"ID":"b7b5c12b-680b-4814-906c-62c9f8702559","Type":"ContainerStarted","Data":"5c0df191f7913515c8c2e6d58fd1b22472416a3ffc56fa937898317ea4ac8208"} Jan 29 03:42:57 crc kubenswrapper[4707]: E0129 03:42:57.202405 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/octavia-operator@sha256:c7804813a3bba8910a47a5f32bd528335e18397f93cf5f7e7181d3d2c209b59b\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8" podUID="26d2ace7-4405-480c-acf8-233e1511007f" Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.241940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt" 
event={"ID":"f57d529d-1352-47d9-baa8-a2f383374b35","Type":"ContainerStarted","Data":"8bcac68bbe9dd7e88311d8b04b2de74bae9bfbf752a57d16f51e93f909ba8754"} Jan 29 03:42:57 crc kubenswrapper[4707]: E0129 03:42:57.252775 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt" podUID="f57d529d-1352-47d9-baa8-a2f383374b35" Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.266242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" event={"ID":"706ea7e5-d8b2-4bc1-900b-d62dddcad89e","Type":"ContainerStarted","Data":"320784b43f3c0c7fc455e0404916a3e66c80c1ce7386eafbe0e6ad71d12e590e"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.266293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc" event={"ID":"7d2c1f08-0b63-4368-a7cc-9374d0dbf035","Type":"ContainerStarted","Data":"ff48713b9a911b82b46ade401bdcb3c3581ac8629f2322043279ea078eeaf5b6"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.273094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4" event={"ID":"02f283f2-5bf1-4ee7-ac34-751ffc96421c","Type":"ContainerStarted","Data":"9b2fd7fd0052a271bad16489d9da55e7182ec34352f5ec9d1ca600688c5111b9"} Jan 29 03:42:57 crc kubenswrapper[4707]: E0129 03:42:57.275688 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/lmiccini/watcher-operator@sha256:35f1eb96f42069bb8f7c33942fb86b41843ba02803464245c16192ccda3d50e4\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" podUID="706ea7e5-d8b2-4bc1-900b-d62dddcad89e" Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.277098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2" event={"ID":"e15cd320-f902-4d99-8037-5c9355f4a833","Type":"ContainerStarted","Data":"eb528fc99c5826c5b02bb87498def44285d6410413ce213362f69ee47b8f7eea"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.295728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj" event={"ID":"6e5e159a-c89d-43cf-b9cf-4a92de09ac22","Type":"ContainerStarted","Data":"37f336e245aaf0d9f2095537c341de8bc8b50a9dd3269cc64dd91aeb962bcb85"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.309949 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf" event={"ID":"6fa97c9f-4b04-4795-9f11-9790c692ba0f","Type":"ContainerStarted","Data":"959f50d91c31adf2dcc839dfd5382ffeb6164ebbb3af5c9d73b6999e38faa352"} Jan 29 03:42:57 crc kubenswrapper[4707]: I0129 03:42:57.329780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd" event={"ID":"55320b88-8f86-47bd-8718-6cabd0865a1c","Type":"ContainerStarted","Data":"d5aa7761533b4ccfb50d7c778ced8bb50fe547b23cc88eca2c6dc0462f0668ac"} Jan 29 03:42:58 crc kubenswrapper[4707]: I0129 03:42:58.187625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " 
pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:42:58 crc kubenswrapper[4707]: E0129 03:42:58.187987 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 03:42:58 crc kubenswrapper[4707]: E0129 03:42:58.188110 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert podName:09633ead-78c6-4934-95c2-05b24c6fc3e5 nodeName:}" failed. No retries permitted until 2026-01-29 03:43:02.188082347 +0000 UTC m=+935.672311252 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert") pod "infra-operator-controller-manager-79955696d6-4wck9" (UID: "09633ead-78c6-4934-95c2-05b24c6fc3e5") : secret "infra-operator-webhook-server-cert" not found Jan 29 03:42:58 crc kubenswrapper[4707]: E0129 03:42:58.363774 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/octavia-operator@sha256:c7804813a3bba8910a47a5f32bd528335e18397f93cf5f7e7181d3d2c209b59b\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8" podUID="26d2ace7-4405-480c-acf8-233e1511007f" Jan 29 03:42:58 crc kubenswrapper[4707]: E0129 03:42:58.363866 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t" podUID="e0372a1a-cd84-491e-a3d1-f58389a66b63" Jan 29 03:42:58 crc kubenswrapper[4707]: E0129 03:42:58.364080 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt" podUID="f57d529d-1352-47d9-baa8-a2f383374b35" Jan 29 03:42:58 crc kubenswrapper[4707]: E0129 03:42:58.367924 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/watcher-operator@sha256:35f1eb96f42069bb8f7c33942fb86b41843ba02803464245c16192ccda3d50e4\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" podUID="706ea7e5-d8b2-4bc1-900b-d62dddcad89e" Jan 29 03:42:58 crc kubenswrapper[4707]: I0129 03:42:58.598456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:42:58 crc kubenswrapper[4707]: E0129 03:42:58.598741 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 03:42:58 crc kubenswrapper[4707]: E0129 03:42:58.599402 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert podName:a6df0676-63de-4a83-bc60-9b69a2f8777f nodeName:}" failed. No retries permitted until 2026-01-29 03:43:02.599379246 +0000 UTC m=+936.083608151 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" (UID: "a6df0676-63de-4a83-bc60-9b69a2f8777f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 03:42:59 crc kubenswrapper[4707]: I0129 03:42:59.007681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:59 crc kubenswrapper[4707]: I0129 03:42:59.007758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:42:59 crc kubenswrapper[4707]: E0129 03:42:59.007913 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 03:42:59 crc kubenswrapper[4707]: E0129 03:42:59.007965 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 03:42:59 crc kubenswrapper[4707]: E0129 03:42:59.008026 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:43:03.00799386 +0000 UTC m=+936.492222765 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "webhook-server-cert" not found Jan 29 03:42:59 crc kubenswrapper[4707]: E0129 03:42:59.008068 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:43:03.008046281 +0000 UTC m=+936.492275196 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "metrics-server-cert" not found Jan 29 03:43:02 crc kubenswrapper[4707]: I0129 03:43:02.286116 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:43:02 crc kubenswrapper[4707]: E0129 03:43:02.290868 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 03:43:02 crc kubenswrapper[4707]: E0129 03:43:02.291131 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert podName:09633ead-78c6-4934-95c2-05b24c6fc3e5 nodeName:}" failed. No retries permitted until 2026-01-29 03:43:10.291100056 +0000 UTC m=+943.775328961 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert") pod "infra-operator-controller-manager-79955696d6-4wck9" (UID: "09633ead-78c6-4934-95c2-05b24c6fc3e5") : secret "infra-operator-webhook-server-cert" not found Jan 29 03:43:02 crc kubenswrapper[4707]: I0129 03:43:02.693082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:43:02 crc kubenswrapper[4707]: E0129 03:43:02.693416 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 03:43:02 crc kubenswrapper[4707]: E0129 03:43:02.693587 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert podName:a6df0676-63de-4a83-bc60-9b69a2f8777f nodeName:}" failed. No retries permitted until 2026-01-29 03:43:10.693522277 +0000 UTC m=+944.177751252 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" (UID: "a6df0676-63de-4a83-bc60-9b69a2f8777f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 03:43:03 crc kubenswrapper[4707]: I0129 03:43:03.097812 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:03 crc kubenswrapper[4707]: I0129 03:43:03.097891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:03 crc kubenswrapper[4707]: E0129 03:43:03.098044 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 03:43:03 crc kubenswrapper[4707]: E0129 03:43:03.098125 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:43:11.098102438 +0000 UTC m=+944.582331343 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "metrics-server-cert" not found Jan 29 03:43:03 crc kubenswrapper[4707]: E0129 03:43:03.098044 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 03:43:03 crc kubenswrapper[4707]: E0129 03:43:03.098194 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:43:11.09817917 +0000 UTC m=+944.582408075 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "webhook-server-cert" not found Jan 29 03:43:03 crc kubenswrapper[4707]: I0129 03:43:03.463193 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:43:03 crc kubenswrapper[4707]: I0129 03:43:03.463274 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:43:03 crc kubenswrapper[4707]: I0129 03:43:03.463330 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:43:03 crc kubenswrapper[4707]: I0129 03:43:03.464101 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db143d5776abf0f9e4af062dd3ffe22d1bdacd65eb8ea86d4728ea8e0ca0f327"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 03:43:03 crc kubenswrapper[4707]: I0129 03:43:03.464163 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://db143d5776abf0f9e4af062dd3ffe22d1bdacd65eb8ea86d4728ea8e0ca0f327" gracePeriod=600 Jan 29 03:43:04 crc kubenswrapper[4707]: I0129 03:43:04.410802 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="db143d5776abf0f9e4af062dd3ffe22d1bdacd65eb8ea86d4728ea8e0ca0f327" exitCode=0 Jan 29 03:43:04 crc kubenswrapper[4707]: I0129 03:43:04.410863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"db143d5776abf0f9e4af062dd3ffe22d1bdacd65eb8ea86d4728ea8e0ca0f327"} Jan 29 03:43:04 crc kubenswrapper[4707]: I0129 03:43:04.410948 4707 scope.go:117] "RemoveContainer" containerID="fdbfa93f1cdcabdbaa105826dff7f462fdab9740abd38ddc05a6f8c6801cf011" Jan 29 03:43:08 crc kubenswrapper[4707]: E0129 03:43:08.206815 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/lmiccini/designate-operator@sha256:29a3092217e72f1ec8a163ed3d15a0a5ccc5b3117e64c72bf5e68597cc233b3d" Jan 29 03:43:08 crc kubenswrapper[4707]: E0129 03:43:08.207623 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/designate-operator@sha256:29a3092217e72f1ec8a163ed3d15a0a5ccc5b3117e64c72bf5e68597cc233b3d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rpk49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66dfbd6f5d-jlsmf_openstack-operators(f064b8fa-dd53-4fd8-8440-9e517b1c1279): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:08 crc kubenswrapper[4707]: E0129 03:43:08.208968 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" podUID="f064b8fa-dd53-4fd8-8440-9e517b1c1279" Jan 29 03:43:08 crc kubenswrapper[4707]: E0129 03:43:08.450287 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/designate-operator@sha256:29a3092217e72f1ec8a163ed3d15a0a5ccc5b3117e64c72bf5e68597cc233b3d\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" podUID="f064b8fa-dd53-4fd8-8440-9e517b1c1279" Jan 29 03:43:08 crc kubenswrapper[4707]: E0129 03:43:08.975931 4707 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/cinder-operator@sha256:6da7ec7bf701fe1dd489852a16429f163a69073fae67b872dca4b080cc3514ad" Jan 29 03:43:08 crc kubenswrapper[4707]: E0129 03:43:08.976185 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/cinder-operator@sha256:6da7ec7bf701fe1dd489852a16429f163a69073fae67b872dca4b080cc3514ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9f9tr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-f6487bd57-qbdg4_openstack-operators(155a3715-4600-4f83-8db3-a6beaf5c3394): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:08 crc kubenswrapper[4707]: E0129 03:43:08.977454 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" podUID="155a3715-4600-4f83-8db3-a6beaf5c3394" Jan 29 03:43:09 crc kubenswrapper[4707]: E0129 03:43:09.467288 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/cinder-operator@sha256:6da7ec7bf701fe1dd489852a16429f163a69073fae67b872dca4b080cc3514ad\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" podUID="155a3715-4600-4f83-8db3-a6beaf5c3394" Jan 29 03:43:09 crc kubenswrapper[4707]: E0129 03:43:09.566872 4707 log.go:32] "PullImage from image service failed" err="rpc error: code 
= Canceled desc = copying config: context canceled" image="quay.io/lmiccini/neutron-operator@sha256:22665b40ffeef62d1a612c1f9f0fa8e97ff95085fad123895d786b770f421fc0" Jan 29 03:43:09 crc kubenswrapper[4707]: E0129 03:43:09.567106 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/neutron-operator@sha256:22665b40ffeef62d1a612c1f9f0fa8e97ff95085fad123895d786b770f421fc0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9wmpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-694c5bfc85-g6dzc_openstack-operators(7d2c1f08-0b63-4368-a7cc-9374d0dbf035): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:09 crc kubenswrapper[4707]: E0129 03:43:09.568335 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc" podUID="7d2c1f08-0b63-4368-a7cc-9374d0dbf035" Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.104819 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/heat-operator@sha256:429171b44a24e9e4dde46465d90a272d93b15317ea386184d6ad077cc119d3c9" Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.105513 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/lmiccini/heat-operator@sha256:429171b44a24e9e4dde46465d90a272d93b15317ea386184d6ad077cc119d3c9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hbncl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-587c6bfdcf-c8v2v_openstack-operators(de21d951-1d0b-415e-8923-5fa2cc58e439): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.106781 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" podUID="de21d951-1d0b-415e-8923-5fa2cc58e439" Jan 29 03:43:10 crc kubenswrapper[4707]: I0129 03:43:10.337611 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.338253 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.338348 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert podName:09633ead-78c6-4934-95c2-05b24c6fc3e5 nodeName:}" failed. No retries permitted until 2026-01-29 03:43:26.338324296 +0000 UTC m=+959.822553201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert") pod "infra-operator-controller-manager-79955696d6-4wck9" (UID: "09633ead-78c6-4934-95c2-05b24c6fc3e5") : secret "infra-operator-webhook-server-cert" not found Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.475893 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/neutron-operator@sha256:22665b40ffeef62d1a612c1f9f0fa8e97ff95085fad123895d786b770f421fc0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc" podUID="7d2c1f08-0b63-4368-a7cc-9374d0dbf035" Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.477468 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/heat-operator@sha256:429171b44a24e9e4dde46465d90a272d93b15317ea386184d6ad077cc119d3c9\\\"\"" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" podUID="de21d951-1d0b-415e-8923-5fa2cc58e439" Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.727435 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382" Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.727797 4707 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6qlb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-zmvww_openstack-operators(b7b5c12b-680b-4814-906c-62c9f8702559): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.729140 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww" podUID="b7b5c12b-680b-4814-906c-62c9f8702559" Jan 29 03:43:10 crc kubenswrapper[4707]: I0129 03:43:10.743250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.743728 4707 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 03:43:10 crc kubenswrapper[4707]: E0129 03:43:10.743842 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert podName:a6df0676-63de-4a83-bc60-9b69a2f8777f nodeName:}" failed. No retries permitted until 2026-01-29 03:43:26.743815493 +0000 UTC m=+960.228044408 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" (UID: "a6df0676-63de-4a83-bc60-9b69a2f8777f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 03:43:11 crc kubenswrapper[4707]: I0129 03:43:11.150700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:11 crc kubenswrapper[4707]: I0129 03:43:11.150775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.150994 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.151094 4707 secret.go:188] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.151127 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:43:27.151096659 +0000 UTC m=+960.635325564 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "webhook-server-cert" not found Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.151219 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs podName:d938abde-b4d6-4d4e-a176-9ed92ac5325d nodeName:}" failed. No retries permitted until 2026-01-29 03:43:27.151191632 +0000 UTC m=+960.635420597 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs") pod "openstack-operator-controller-manager-cc96c49b6-x4zwn" (UID: "d938abde-b4d6-4d4e-a176-9ed92ac5325d") : secret "metrics-server-cert" not found Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.314253 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/manila-operator@sha256:2e1a77365c3b08ff39892565abfc72b72e969f623e58a2663fb93890371fc9da" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.314514 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/manila-operator@sha256:2e1a77365c3b08ff39892565abfc72b72e969f623e58a2663fb93890371fc9da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dcvqw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-765668569f-jq2z4_openstack-operators(02f283f2-5bf1-4ee7-ac34-751ffc96421c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.317268 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4" podUID="02f283f2-5bf1-4ee7-ac34-751ffc96421c" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.435605 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:6c13b4fe51339271afe9389aeb9fc1af07ec3c81" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.438320 4707 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:6c13b4fe51339271afe9389aeb9fc1af07ec3c81" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.438734 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:6c13b4fe51339271afe9389aeb9fc1af07ec3c81,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fcd8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7886d5cc69-w8rzq_openstack-operators(0a32b73c-f66f-425f-81a9-ef1cc36041d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.441501 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq" podUID="0a32b73c-f66f-425f-81a9-ef1cc36041d4" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.482019 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww" podUID="b7b5c12b-680b-4814-906c-62c9f8702559" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.482821 4707 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:6c13b4fe51339271afe9389aeb9fc1af07ec3c81\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq" podUID="0a32b73c-f66f-425f-81a9-ef1cc36041d4" Jan 29 03:43:11 crc kubenswrapper[4707]: E0129 03:43:11.483191 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/manila-operator@sha256:2e1a77365c3b08ff39892565abfc72b72e969f623e58a2663fb93890371fc9da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4" podUID="02f283f2-5bf1-4ee7-ac34-751ffc96421c" Jan 29 03:43:12 crc kubenswrapper[4707]: E0129 03:43:12.164421 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 29 03:43:12 crc kubenswrapper[4707]: E0129 03:43:12.164721 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9lb6z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-b6ggj_openstack-operators(6fb86866-7c9d-4b4f-bf81-8a36898aca3d): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:12 crc kubenswrapper[4707]: E0129 03:43:12.165954 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" podUID="6fb86866-7c9d-4b4f-bf81-8a36898aca3d" Jan 29 03:43:12 crc kubenswrapper[4707]: E0129 03:43:12.492645 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" podUID="6fb86866-7c9d-4b4f-bf81-8a36898aca3d" Jan 29 03:43:12 crc kubenswrapper[4707]: E0129 03:43:12.898500 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/keystone-operator@sha256:45ef0b95f941479535575b3d2cabb58a52e1d8490eed3da1bca9acd49344a722" Jan 29 03:43:12 crc kubenswrapper[4707]: E0129 03:43:12.898846 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/keystone-operator@sha256:45ef0b95f941479535575b3d2cabb58a52e1d8490eed3da1bca9acd49344a722,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b9b6z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-6978b79747-49nhf_openstack-operators(6fa97c9f-4b04-4795-9f11-9790c692ba0f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:12 crc kubenswrapper[4707]: E0129 03:43:12.904736 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf" podUID="6fa97c9f-4b04-4795-9f11-9790c692ba0f" Jan 29 03:43:13 crc kubenswrapper[4707]: E0129 03:43:13.502621 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/keystone-operator@sha256:45ef0b95f941479535575b3d2cabb58a52e1d8490eed3da1bca9acd49344a722\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf" podUID="6fa97c9f-4b04-4795-9f11-9790c692ba0f" Jan 29 03:43:14 crc kubenswrapper[4707]: I0129 03:43:14.510269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"b9348d06267b549d79524d7d6fb99695969175eb246c0104709c649f6ca1b571"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.524488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" event={"ID":"3445268f-15c8-4438-8fd1-a13d2bd9981d","Type":"ContainerStarted","Data":"a0bf3f6f8d86cf682ea05b71e5072f04d1caabf6f86033867b08bc171db2a86e"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.525127 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.526251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj" event={"ID":"6e5e159a-c89d-43cf-b9cf-4a92de09ac22","Type":"ContainerStarted","Data":"11c12d54c56ed5781ae2b228b71fafd113297e5145e1d1fcc21feb0839e96902"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.526765 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.528166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t" event={"ID":"e0372a1a-cd84-491e-a3d1-f58389a66b63","Type":"ContainerStarted","Data":"5700386763f7f62a3cab8c21bf5f3a1d479d22e9ee6f0539935870b4bf6cf9f8"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.528622 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.530575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd" event={"ID":"55320b88-8f86-47bd-8718-6cabd0865a1c","Type":"ContainerStarted","Data":"88678f323c5115402e69fe002fe79327c5a43a4f17bc7c1dada036fcf17116a0"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.530778 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.531976 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh" event={"ID":"6dbe27ba-a451-4202-8f58-73cb0684bfea","Type":"ContainerStarted","Data":"c9573df9d8e0faf431f290d4d1bd46f2766f58b7107adc2c2c9938ee7ed5d5a2"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.532138 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.534121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2" event={"ID":"e15cd320-f902-4d99-8037-5c9355f4a833","Type":"ContainerStarted","Data":"51bfb45bcc1d3dcb83c480f06c76252423e9e9de524f77702ef28d47cdfed8dc"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.534242 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.535726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt" event={"ID":"f57d529d-1352-47d9-baa8-a2f383374b35","Type":"ContainerStarted","Data":"eae0089e0a2623523871084adb791586e7532547bc4b8759a2db204d30bfeb96"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.535918 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.537396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46" event={"ID":"8255e85b-8815-4860-9325-7570ba9a6fd9","Type":"ContainerStarted","Data":"c62303b3f04f22308c28aeec4a9c014454f3040e35ffd93fd2117b53e806b707"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.539171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp" event={"ID":"59a9dc92-c9db-4bfa-8233-88b1690beaad","Type":"ContainerStarted","Data":"a31811db22c0cfae3f4d74a310d282e3e261019958818ebc5b8f920666202a73"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.539353 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.569642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8" event={"ID":"26d2ace7-4405-480c-acf8-233e1511007f","Type":"ContainerStarted","Data":"42c7f5ebe64f82e4b25093a940551c1b3c87e0455c9091ac5ff69e8d8ceb58c7"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.570093 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.572849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" event={"ID":"706ea7e5-d8b2-4bc1-900b-d62dddcad89e","Type":"ContainerStarted","Data":"c37ab4461f30f5fe8b0289765d8cda8d4bcf3839367882a3b6d20735bf0bf6d1"} Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 
03:43:16.573348 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.580376 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" podStartSLOduration=5.894427768 podStartE2EDuration="22.580344752s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:55.458188372 +0000 UTC m=+928.942417277" lastFinishedPulling="2026-01-29 03:43:12.144105356 +0000 UTC m=+945.628334261" observedRunningTime="2026-01-29 03:43:16.559818189 +0000 UTC m=+950.044047124" watchObservedRunningTime="2026-01-29 03:43:16.580344752 +0000 UTC m=+950.064573657" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.640181 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2" podStartSLOduration=5.124956215 podStartE2EDuration="22.640154723s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.27471647 +0000 UTC m=+929.758945375" lastFinishedPulling="2026-01-29 03:43:13.789914978 +0000 UTC m=+947.274143883" observedRunningTime="2026-01-29 03:43:16.637987252 +0000 UTC m=+950.122216157" watchObservedRunningTime="2026-01-29 03:43:16.640154723 +0000 UTC m=+950.124383628" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.706186 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp" podStartSLOduration=5.189382876 podStartE2EDuration="22.706162257s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.273068465 +0000 UTC m=+929.757297370" lastFinishedPulling="2026-01-29 03:43:13.789847846 +0000 UTC m=+947.274076751" observedRunningTime="2026-01-29 
03:43:16.667871487 +0000 UTC m=+950.152100392" watchObservedRunningTime="2026-01-29 03:43:16.706162257 +0000 UTC m=+950.190391162" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.707197 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj" podStartSLOduration=6.162213638 podStartE2EDuration="22.707191205s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.304622016 +0000 UTC m=+929.788850921" lastFinishedPulling="2026-01-29 03:43:12.849599583 +0000 UTC m=+946.333828488" observedRunningTime="2026-01-29 03:43:16.69912188 +0000 UTC m=+950.183350785" watchObservedRunningTime="2026-01-29 03:43:16.707191205 +0000 UTC m=+950.191420110" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.770910 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd" podStartSLOduration=6.802623108 podStartE2EDuration="22.770886795s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.17587988 +0000 UTC m=+929.660108785" lastFinishedPulling="2026-01-29 03:43:12.144143567 +0000 UTC m=+945.628372472" observedRunningTime="2026-01-29 03:43:16.767610183 +0000 UTC m=+950.251839088" watchObservedRunningTime="2026-01-29 03:43:16.770886795 +0000 UTC m=+950.255115690" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.808205 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh" podStartSLOduration=4.813230808 podStartE2EDuration="22.808176456s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:55.794898738 +0000 UTC m=+929.279127643" lastFinishedPulling="2026-01-29 03:43:13.789844376 +0000 UTC m=+947.274073291" observedRunningTime="2026-01-29 03:43:16.803889907 +0000 UTC 
m=+950.288118802" watchObservedRunningTime="2026-01-29 03:43:16.808176456 +0000 UTC m=+950.292405361" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.878129 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt" podStartSLOduration=3.498186426 podStartE2EDuration="22.87810516s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.525849295 +0000 UTC m=+930.010078190" lastFinishedPulling="2026-01-29 03:43:15.905768019 +0000 UTC m=+949.389996924" observedRunningTime="2026-01-29 03:43:16.85342539 +0000 UTC m=+950.337654305" watchObservedRunningTime="2026-01-29 03:43:16.87810516 +0000 UTC m=+950.362334065" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.882249 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t" podStartSLOduration=3.448173609 podStartE2EDuration="22.882237235s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.46837012 +0000 UTC m=+929.952599025" lastFinishedPulling="2026-01-29 03:43:15.902433746 +0000 UTC m=+949.386662651" observedRunningTime="2026-01-29 03:43:16.876596057 +0000 UTC m=+950.360824962" watchObservedRunningTime="2026-01-29 03:43:16.882237235 +0000 UTC m=+950.366466140" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.913600 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46" podStartSLOduration=5.2976747490000005 podStartE2EDuration="22.91357416s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.173970056 +0000 UTC m=+929.658198961" lastFinishedPulling="2026-01-29 03:43:13.789869467 +0000 UTC m=+947.274098372" observedRunningTime="2026-01-29 03:43:16.908603311 +0000 UTC m=+950.392832216" 
watchObservedRunningTime="2026-01-29 03:43:16.91357416 +0000 UTC m=+950.397803065" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.934594 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" podStartSLOduration=3.487479747 podStartE2EDuration="22.934569677s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.458819253 +0000 UTC m=+929.943048158" lastFinishedPulling="2026-01-29 03:43:15.905909173 +0000 UTC m=+949.390138088" observedRunningTime="2026-01-29 03:43:16.929802404 +0000 UTC m=+950.414031309" watchObservedRunningTime="2026-01-29 03:43:16.934569677 +0000 UTC m=+950.418798582" Jan 29 03:43:16 crc kubenswrapper[4707]: I0129 03:43:16.954365 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8" podStartSLOduration=4.3325359 podStartE2EDuration="22.954343459s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.467171106 +0000 UTC m=+929.951400011" lastFinishedPulling="2026-01-29 03:43:15.088978675 +0000 UTC m=+948.573207570" observedRunningTime="2026-01-29 03:43:16.953740452 +0000 UTC m=+950.437969347" watchObservedRunningTime="2026-01-29 03:43:16.954343459 +0000 UTC m=+950.438572364" Jan 29 03:43:17 crc kubenswrapper[4707]: I0129 03:43:17.584850 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46" Jan 29 03:43:22 crc kubenswrapper[4707]: I0129 03:43:22.622119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" event={"ID":"155a3715-4600-4f83-8db3-a6beaf5c3394","Type":"ContainerStarted","Data":"4fb328558257573a187c9fce5361ac0833addb7525f5acdb36e67b4a2b877a42"} Jan 29 03:43:22 crc kubenswrapper[4707]: I0129 
03:43:22.622851 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" Jan 29 03:43:23 crc kubenswrapper[4707]: I0129 03:43:23.269760 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" podStartSLOduration=3.278807736 podStartE2EDuration="29.269735135s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:55.687030884 +0000 UTC m=+929.171259789" lastFinishedPulling="2026-01-29 03:43:21.677958283 +0000 UTC m=+955.162187188" observedRunningTime="2026-01-29 03:43:22.666503615 +0000 UTC m=+956.150732520" watchObservedRunningTime="2026-01-29 03:43:23.269735135 +0000 UTC m=+956.753964040" Jan 29 03:43:23 crc kubenswrapper[4707]: I0129 03:43:23.633075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq" event={"ID":"0a32b73c-f66f-425f-81a9-ef1cc36041d4","Type":"ContainerStarted","Data":"5e59b240a8d86bf8b2845f7c035aee2d83a84f5cefd7f6b1b3938cdd29ed723e"} Jan 29 03:43:23 crc kubenswrapper[4707]: I0129 03:43:23.633488 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq" Jan 29 03:43:23 crc kubenswrapper[4707]: I0129 03:43:23.651476 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq" podStartSLOduration=2.56112579 podStartE2EDuration="29.651443327s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.241811361 +0000 UTC m=+929.726040266" lastFinishedPulling="2026-01-29 03:43:23.332128898 +0000 UTC m=+956.816357803" observedRunningTime="2026-01-29 03:43:23.650976184 +0000 UTC m=+957.135205099" watchObservedRunningTime="2026-01-29 
03:43:23.651443327 +0000 UTC m=+957.135672272" Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.455143 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-6db5dbd896-c2sgx" Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.642168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4" event={"ID":"02f283f2-5bf1-4ee7-ac34-751ffc96421c","Type":"ContainerStarted","Data":"a579a5723c5643c87eb0b617804889ff8a30254116e4eee99647afd699bffa7a"} Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.643463 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4" Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.644483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" event={"ID":"f064b8fa-dd53-4fd8-8440-9e517b1c1279","Type":"ContainerStarted","Data":"ba84b1c505c1358c87b41165bf36aad7ca894ea2200785c08b50f693aa54ebe8"} Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.644728 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.662254 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4" podStartSLOduration=3.101537396 podStartE2EDuration="30.662237911s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.164467141 +0000 UTC m=+929.648696046" lastFinishedPulling="2026-01-29 03:43:23.725167656 +0000 UTC m=+957.209396561" observedRunningTime="2026-01-29 03:43:24.660253306 +0000 UTC m=+958.144482211" watchObservedRunningTime="2026-01-29 
03:43:24.662237911 +0000 UTC m=+958.146466816" Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.686197 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" podStartSLOduration=2.258082526 podStartE2EDuration="30.68617988s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:55.294746037 +0000 UTC m=+928.778974942" lastFinishedPulling="2026-01-29 03:43:23.722843391 +0000 UTC m=+957.207072296" observedRunningTime="2026-01-29 03:43:24.683063423 +0000 UTC m=+958.167292328" watchObservedRunningTime="2026-01-29 03:43:24.68617988 +0000 UTC m=+958.170408785" Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.717682 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6bc7f4f4cf-9xlqh" Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.883384 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-z2vvp" Jan 29 03:43:24 crc kubenswrapper[4707]: I0129 03:43:24.890445 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-958664b5-qgj46" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.089693 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-jvdtd" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.107585 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-mrfw2" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.168618 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5c765b4558-5jtr8" Jan 
29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.274164 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-4cp2t" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.309736 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wblrt" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.488768 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-lnfzj" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.537304 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-767b8bc766-mhnm2" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.656444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" event={"ID":"de21d951-1d0b-415e-8923-5fa2cc58e439","Type":"ContainerStarted","Data":"d7c365f7d0f7a6e94959ad8a496fc16d06b6fc51e3e676730a95465f437ae4fa"} Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.658082 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.661594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww" event={"ID":"b7b5c12b-680b-4814-906c-62c9f8702559","Type":"ContainerStarted","Data":"abb78043bad4fbac790c272862caecd79a5620e721c5369d654d027023e3b580"} Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.662028 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww" Jan 29 03:43:25 crc 
kubenswrapper[4707]: I0129 03:43:25.663879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc" event={"ID":"7d2c1f08-0b63-4368-a7cc-9374d0dbf035","Type":"ContainerStarted","Data":"2252887e57d40bcc410d31e599a71e7188ecedecae276dcbcd444352e064974e"} Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.665169 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.677089 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" podStartSLOduration=2.925077066 podStartE2EDuration="31.677065878s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:55.948673253 +0000 UTC m=+929.432902158" lastFinishedPulling="2026-01-29 03:43:24.700662065 +0000 UTC m=+958.184890970" observedRunningTime="2026-01-29 03:43:25.675659499 +0000 UTC m=+959.159888404" watchObservedRunningTime="2026-01-29 03:43:25.677065878 +0000 UTC m=+959.161294783" Jan 29 03:43:25 crc kubenswrapper[4707]: I0129 03:43:25.702044 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc" podStartSLOduration=3.26171521 podStartE2EDuration="31.701983384s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.260339429 +0000 UTC m=+929.744568334" lastFinishedPulling="2026-01-29 03:43:24.700607583 +0000 UTC m=+958.184836508" observedRunningTime="2026-01-29 03:43:25.696848831 +0000 UTC m=+959.181077736" watchObservedRunningTime="2026-01-29 03:43:25.701983384 +0000 UTC m=+959.186212289" Jan 29 03:43:26 crc kubenswrapper[4707]: I0129 03:43:26.269594 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww" podStartSLOduration=3.742818728 podStartE2EDuration="32.269569408s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.307775534 +0000 UTC m=+929.792004439" lastFinishedPulling="2026-01-29 03:43:24.834526224 +0000 UTC m=+958.318755119" observedRunningTime="2026-01-29 03:43:25.721314754 +0000 UTC m=+959.205543659" watchObservedRunningTime="2026-01-29 03:43:26.269569408 +0000 UTC m=+959.753798313" Jan 29 03:43:26 crc kubenswrapper[4707]: I0129 03:43:26.419652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:43:26 crc kubenswrapper[4707]: I0129 03:43:26.438207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09633ead-78c6-4934-95c2-05b24c6fc3e5-cert\") pod \"infra-operator-controller-manager-79955696d6-4wck9\" (UID: \"09633ead-78c6-4934-95c2-05b24c6fc3e5\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:43:26 crc kubenswrapper[4707]: I0129 03:43:26.671115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" event={"ID":"6fb86866-7c9d-4b4f-bf81-8a36898aca3d","Type":"ContainerStarted","Data":"f9cfd81fbf65bcb8b478dacdd56ebc1ec61b216752ead13fdedb047faecb2433"} Jan 29 03:43:26 crc kubenswrapper[4707]: I0129 03:43:26.695601 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-b6ggj" podStartSLOduration=2.482787214 podStartE2EDuration="31.695576397s" 
podCreationTimestamp="2026-01-29 03:42:55 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.49487755 +0000 UTC m=+929.979106455" lastFinishedPulling="2026-01-29 03:43:25.707666733 +0000 UTC m=+959.191895638" observedRunningTime="2026-01-29 03:43:26.688407377 +0000 UTC m=+960.172636282" watchObservedRunningTime="2026-01-29 03:43:26.695576397 +0000 UTC m=+960.179805312" Jan 29 03:43:26 crc kubenswrapper[4707]: I0129 03:43:26.715232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" Jan 29 03:43:26 crc kubenswrapper[4707]: I0129 03:43:26.827664 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:43:26 crc kubenswrapper[4707]: I0129 03:43:26.854639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6df0676-63de-4a83-bc60-9b69a2f8777f-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv\" (UID: \"a6df0676-63de-4a83-bc60-9b69a2f8777f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.015394 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.188219 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tmw5z"] Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.190909 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.203840 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tmw5z"] Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.233682 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.233834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.240020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-webhook-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.241912 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-4wck9"] Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.246111 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d938abde-b4d6-4d4e-a176-9ed92ac5325d-metrics-certs\") pod \"openstack-operator-controller-manager-cc96c49b6-x4zwn\" (UID: \"d938abde-b4d6-4d4e-a176-9ed92ac5325d\") " pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.335510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-catalog-content\") pod \"certified-operators-tmw5z\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.335612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngqxn\" (UniqueName: \"kubernetes.io/projected/30e40a99-55e8-46f8-98e8-2555f77a40e3-kube-api-access-ngqxn\") pod \"certified-operators-tmw5z\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.335670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-utilities\") pod \"certified-operators-tmw5z\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.359517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-s45xq" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.367906 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.436640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-utilities\") pod \"certified-operators-tmw5z\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.436726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-catalog-content\") pod \"certified-operators-tmw5z\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.436769 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngqxn\" (UniqueName: \"kubernetes.io/projected/30e40a99-55e8-46f8-98e8-2555f77a40e3-kube-api-access-ngqxn\") pod \"certified-operators-tmw5z\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.437475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-catalog-content\") pod \"certified-operators-tmw5z\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.437617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-utilities\") pod \"certified-operators-tmw5z\" (UID: 
\"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.471630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngqxn\" (UniqueName: \"kubernetes.io/projected/30e40a99-55e8-46f8-98e8-2555f77a40e3-kube-api-access-ngqxn\") pod \"certified-operators-tmw5z\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.507772 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv"] Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.529596 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.713934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" event={"ID":"09633ead-78c6-4934-95c2-05b24c6fc3e5","Type":"ContainerStarted","Data":"8dc1be56c5de98677e435665bda980f3fddfc5ea4590194b2e59af074caaf1a6"} Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.736827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" event={"ID":"a6df0676-63de-4a83-bc60-9b69a2f8777f","Type":"ContainerStarted","Data":"14de94b5d5ed8ff9fffc9993dbd7b943a6cddba921d5be4f6ec12ff98137b34a"} Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.745199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf" event={"ID":"6fa97c9f-4b04-4795-9f11-9790c692ba0f","Type":"ContainerStarted","Data":"44c490764b369217e186620ff9b8e152437dbd6fa85d8051a51a2a6461d5053a"} Jan 29 03:43:27 crc kubenswrapper[4707]: 
I0129 03:43:27.747670 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.771663 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf" podStartSLOduration=3.096753422 podStartE2EDuration="33.771640435s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:42:56.144217585 +0000 UTC m=+929.628446490" lastFinishedPulling="2026-01-29 03:43:26.819104598 +0000 UTC m=+960.303333503" observedRunningTime="2026-01-29 03:43:27.770460992 +0000 UTC m=+961.254689897" watchObservedRunningTime="2026-01-29 03:43:27.771640435 +0000 UTC m=+961.255869340" Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.887121 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn"] Jan 29 03:43:27 crc kubenswrapper[4707]: W0129 03:43:27.896276 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd938abde_b4d6_4d4e_a176_9ed92ac5325d.slice/crio-7493f4b10b96612990af14a2ec279b9af93073820c347ca38946727bbcaa1b74 WatchSource:0}: Error finding container 7493f4b10b96612990af14a2ec279b9af93073820c347ca38946727bbcaa1b74: Status 404 returned error can't find the container with id 7493f4b10b96612990af14a2ec279b9af93073820c347ca38946727bbcaa1b74 Jan 29 03:43:27 crc kubenswrapper[4707]: I0129 03:43:27.924662 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tmw5z"] Jan 29 03:43:28 crc kubenswrapper[4707]: I0129 03:43:28.755424 4707 generic.go:334] "Generic (PLEG): container finished" podID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerID="56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb" exitCode=0 Jan 
29 03:43:28 crc kubenswrapper[4707]: I0129 03:43:28.755501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmw5z" event={"ID":"30e40a99-55e8-46f8-98e8-2555f77a40e3","Type":"ContainerDied","Data":"56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb"} Jan 29 03:43:28 crc kubenswrapper[4707]: I0129 03:43:28.755570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmw5z" event={"ID":"30e40a99-55e8-46f8-98e8-2555f77a40e3","Type":"ContainerStarted","Data":"066c3c709c3b68f33b7b0e2330e6eae83743f02482204d0ae1695dbbfde341b4"} Jan 29 03:43:28 crc kubenswrapper[4707]: I0129 03:43:28.759347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" event={"ID":"d938abde-b4d6-4d4e-a176-9ed92ac5325d","Type":"ContainerStarted","Data":"be2fff01e6d0954ec7cfc35ec029883c048d001c1df4d9e93a87321cb3a28ae6"} Jan 29 03:43:28 crc kubenswrapper[4707]: I0129 03:43:28.759423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" event={"ID":"d938abde-b4d6-4d4e-a176-9ed92ac5325d","Type":"ContainerStarted","Data":"7493f4b10b96612990af14a2ec279b9af93073820c347ca38946727bbcaa1b74"} Jan 29 03:43:28 crc kubenswrapper[4707]: I0129 03:43:28.809523 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" podStartSLOduration=33.809503156 podStartE2EDuration="33.809503156s" podCreationTimestamp="2026-01-29 03:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:43:28.805052511 +0000 UTC m=+962.289281436" watchObservedRunningTime="2026-01-29 03:43:28.809503156 +0000 UTC m=+962.293732071" Jan 29 03:43:29 crc kubenswrapper[4707]: I0129 
03:43:29.769564 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:34 crc kubenswrapper[4707]: I0129 03:43:34.411301 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-f6487bd57-qbdg4" Jan 29 03:43:34 crc kubenswrapper[4707]: I0129 03:43:34.428125 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66dfbd6f5d-jlsmf" Jan 29 03:43:34 crc kubenswrapper[4707]: I0129 03:43:34.800137 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-587c6bfdcf-c8v2v" Jan 29 03:43:34 crc kubenswrapper[4707]: I0129 03:43:34.900911 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6978b79747-49nhf" Jan 29 03:43:34 crc kubenswrapper[4707]: I0129 03:43:34.969936 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-765668569f-jq2z4" Jan 29 03:43:35 crc kubenswrapper[4707]: I0129 03:43:35.079909 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-694c5bfc85-g6dzc" Jan 29 03:43:35 crc kubenswrapper[4707]: I0129 03:43:35.150331 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-zmvww" Jan 29 03:43:35 crc kubenswrapper[4707]: I0129 03:43:35.166255 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7886d5cc69-w8rzq" Jan 29 03:43:37 crc kubenswrapper[4707]: I0129 03:43:37.375643 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-cc96c49b6-x4zwn" Jan 29 03:43:39 crc kubenswrapper[4707]: E0129 03:43:39.662111 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:a504ab83288310bbd8e39f3a01faaa3c210a14d94bbd32124e9eadd46227d6b3" Jan 29 03:43:39 crc kubenswrapper[4707]: E0129 03:43:39.662405 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:a504ab83288310bbd8e39f3a01faaa3c210a14d94bbd32124e9eadd46227d6b3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78gzw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-79955696d6-4wck9_openstack-operators(09633ead-78c6-4934-95c2-05b24c6fc3e5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:43:39 crc kubenswrapper[4707]: E0129 03:43:39.663826 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" podUID="09633ead-78c6-4934-95c2-05b24c6fc3e5" Jan 29 03:43:39 crc kubenswrapper[4707]: E0129 03:43:39.953836 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/infra-operator@sha256:a504ab83288310bbd8e39f3a01faaa3c210a14d94bbd32124e9eadd46227d6b3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" podUID="09633ead-78c6-4934-95c2-05b24c6fc3e5" Jan 29 03:43:40 crc kubenswrapper[4707]: I0129 03:43:40.964119 4707 generic.go:334] "Generic (PLEG): container finished" podID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerID="06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7" exitCode=0 Jan 29 03:43:40 crc kubenswrapper[4707]: I0129 03:43:40.964243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmw5z" event={"ID":"30e40a99-55e8-46f8-98e8-2555f77a40e3","Type":"ContainerDied","Data":"06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7"} Jan 29 03:43:40 crc kubenswrapper[4707]: I0129 03:43:40.967353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" event={"ID":"a6df0676-63de-4a83-bc60-9b69a2f8777f","Type":"ContainerStarted","Data":"7f34f0ab4911e67dc0257809431765481f18771220ab69cb1cf795c1dfb3caf7"} Jan 29 03:43:40 crc kubenswrapper[4707]: I0129 03:43:40.967470 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 03:43:40 crc kubenswrapper[4707]: I0129 03:43:40.967577 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:43:41 crc kubenswrapper[4707]: I0129 03:43:41.036287 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" podStartSLOduration=34.802952962 podStartE2EDuration="47.03625974s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:43:27.508384141 +0000 UTC 
m=+960.992613046" lastFinishedPulling="2026-01-29 03:43:39.741690879 +0000 UTC m=+973.225919824" observedRunningTime="2026-01-29 03:43:41.031359582 +0000 UTC m=+974.515588487" watchObservedRunningTime="2026-01-29 03:43:41.03625974 +0000 UTC m=+974.520488645" Jan 29 03:43:41 crc kubenswrapper[4707]: I0129 03:43:41.979116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmw5z" event={"ID":"30e40a99-55e8-46f8-98e8-2555f77a40e3","Type":"ContainerStarted","Data":"fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc"} Jan 29 03:43:42 crc kubenswrapper[4707]: I0129 03:43:42.006734 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tmw5z" podStartSLOduration=2.384648904 podStartE2EDuration="15.006702916s" podCreationTimestamp="2026-01-29 03:43:27 +0000 UTC" firstStartedPulling="2026-01-29 03:43:28.758283355 +0000 UTC m=+962.242512260" lastFinishedPulling="2026-01-29 03:43:41.380337357 +0000 UTC m=+974.864566272" observedRunningTime="2026-01-29 03:43:42.001383706 +0000 UTC m=+975.485612671" watchObservedRunningTime="2026-01-29 03:43:42.006702916 +0000 UTC m=+975.490931821" Jan 29 03:43:47 crc kubenswrapper[4707]: I0129 03:43:47.022773 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv" Jan 29 03:43:47 crc kubenswrapper[4707]: I0129 03:43:47.530724 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:47 crc kubenswrapper[4707]: I0129 03:43:47.531321 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:47 crc kubenswrapper[4707]: I0129 03:43:47.603052 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:48 crc kubenswrapper[4707]: I0129 03:43:48.099743 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:48 crc kubenswrapper[4707]: I0129 03:43:48.179363 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tmw5z"] Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.053337 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tmw5z" podUID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerName="registry-server" containerID="cri-o://fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc" gracePeriod=2 Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.441690 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tmw5z" Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.625847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngqxn\" (UniqueName: \"kubernetes.io/projected/30e40a99-55e8-46f8-98e8-2555f77a40e3-kube-api-access-ngqxn\") pod \"30e40a99-55e8-46f8-98e8-2555f77a40e3\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.625958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-utilities\") pod \"30e40a99-55e8-46f8-98e8-2555f77a40e3\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.626078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-catalog-content\") pod 
\"30e40a99-55e8-46f8-98e8-2555f77a40e3\" (UID: \"30e40a99-55e8-46f8-98e8-2555f77a40e3\") " Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.628048 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-utilities" (OuterVolumeSpecName: "utilities") pod "30e40a99-55e8-46f8-98e8-2555f77a40e3" (UID: "30e40a99-55e8-46f8-98e8-2555f77a40e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.639725 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e40a99-55e8-46f8-98e8-2555f77a40e3-kube-api-access-ngqxn" (OuterVolumeSpecName: "kube-api-access-ngqxn") pod "30e40a99-55e8-46f8-98e8-2555f77a40e3" (UID: "30e40a99-55e8-46f8-98e8-2555f77a40e3"). InnerVolumeSpecName "kube-api-access-ngqxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.728194 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngqxn\" (UniqueName: \"kubernetes.io/projected/30e40a99-55e8-46f8-98e8-2555f77a40e3-kube-api-access-ngqxn\") on node \"crc\" DevicePath \"\"" Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.728241 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.775098 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30e40a99-55e8-46f8-98e8-2555f77a40e3" (UID: "30e40a99-55e8-46f8-98e8-2555f77a40e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:43:50 crc kubenswrapper[4707]: I0129 03:43:50.829746 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e40a99-55e8-46f8-98e8-2555f77a40e3-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.067364 4707 generic.go:334] "Generic (PLEG): container finished" podID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerID="fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc" exitCode=0
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.067453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmw5z" event={"ID":"30e40a99-55e8-46f8-98e8-2555f77a40e3","Type":"ContainerDied","Data":"fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc"}
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.067584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmw5z" event={"ID":"30e40a99-55e8-46f8-98e8-2555f77a40e3","Type":"ContainerDied","Data":"066c3c709c3b68f33b7b0e2330e6eae83743f02482204d0ae1695dbbfde341b4"}
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.067503 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tmw5z"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.067629 4707 scope.go:117] "RemoveContainer" containerID="fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.093853 4707 scope.go:117] "RemoveContainer" containerID="06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.127096 4707 scope.go:117] "RemoveContainer" containerID="56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.127637 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tmw5z"]
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.140502 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tmw5z"]
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.154899 4707 scope.go:117] "RemoveContainer" containerID="fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc"
Jan 29 03:43:51 crc kubenswrapper[4707]: E0129 03:43:51.155643 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc\": container with ID starting with fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc not found: ID does not exist" containerID="fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.155701 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc"} err="failed to get container status \"fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc\": rpc error: code = NotFound desc = could not find container \"fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc\": container with ID starting with fdba9297085b057af3a6aaddbb27e0de2bc15451d8bbeaa71bbe2f51acd3e2dc not found: ID does not exist"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.155740 4707 scope.go:117] "RemoveContainer" containerID="06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7"
Jan 29 03:43:51 crc kubenswrapper[4707]: E0129 03:43:51.156100 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7\": container with ID starting with 06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7 not found: ID does not exist" containerID="06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.156192 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7"} err="failed to get container status \"06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7\": rpc error: code = NotFound desc = could not find container \"06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7\": container with ID starting with 06aad1630abd8adf598352daa87d51ed0d84a4b89c63a95c2115bda62e62eeb7 not found: ID does not exist"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.156257 4707 scope.go:117] "RemoveContainer" containerID="56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb"
Jan 29 03:43:51 crc kubenswrapper[4707]: E0129 03:43:51.156922 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb\": container with ID starting with 56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb not found: ID does not exist" containerID="56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.156993 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb"} err="failed to get container status \"56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb\": rpc error: code = NotFound desc = could not find container \"56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb\": container with ID starting with 56e6881adcf69fe1b27086d0451aa5a1e58924d603c8ff949410e31f4b00a1fb not found: ID does not exist"
Jan 29 03:43:51 crc kubenswrapper[4707]: I0129 03:43:51.262092 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e40a99-55e8-46f8-98e8-2555f77a40e3" path="/var/lib/kubelet/pods/30e40a99-55e8-46f8-98e8-2555f77a40e3/volumes"
Jan 29 03:43:55 crc kubenswrapper[4707]: I0129 03:43:55.108200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" event={"ID":"09633ead-78c6-4934-95c2-05b24c6fc3e5","Type":"ContainerStarted","Data":"12361c154409c043a59487434ae933c1d5a35f128a4c57619dbf651a2eb16e55"}
Jan 29 03:43:55 crc kubenswrapper[4707]: I0129 03:43:55.108971 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9"
Jan 29 03:43:55 crc kubenswrapper[4707]: I0129 03:43:55.134372 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9" podStartSLOduration=33.718226577 podStartE2EDuration="1m1.134347148s" podCreationTimestamp="2026-01-29 03:42:54 +0000 UTC" firstStartedPulling="2026-01-29 03:43:27.264484519 +0000 UTC m=+960.748713424" lastFinishedPulling="2026-01-29 03:43:54.68060508 +0000 UTC m=+988.164833995" observedRunningTime="2026-01-29 03:43:55.132900237 +0000 UTC m=+988.617129152" watchObservedRunningTime="2026-01-29 03:43:55.134347148 +0000 UTC m=+988.618576053"
Jan 29 03:44:06 crc kubenswrapper[4707]: I0129 03:44:06.729703 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-4wck9"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.008810 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hdk7w"]
Jan 29 03:44:21 crc kubenswrapper[4707]: E0129 03:44:21.011443 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerName="extract-content"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.011466 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerName="extract-content"
Jan 29 03:44:21 crc kubenswrapper[4707]: E0129 03:44:21.011477 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerName="registry-server"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.011483 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerName="registry-server"
Jan 29 03:44:21 crc kubenswrapper[4707]: E0129 03:44:21.011504 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerName="extract-utilities"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.011511 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerName="extract-utilities"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.011652 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e40a99-55e8-46f8-98e8-2555f77a40e3" containerName="registry-server"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.012568 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.021738 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.022258 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.022372 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.022460 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lnplf"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.022748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hdk7w"]
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.156789 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tz82g"]
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.158327 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.160996 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.163631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-config\") pod \"dnsmasq-dns-675f4bcbfc-hdk7w\" (UID: \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.163681 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm95v\" (UniqueName: \"kubernetes.io/projected/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-kube-api-access-fm95v\") pod \"dnsmasq-dns-675f4bcbfc-hdk7w\" (UID: \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.173184 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tz82g"]
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.265138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-config\") pod \"dnsmasq-dns-675f4bcbfc-hdk7w\" (UID: \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.265197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm95v\" (UniqueName: \"kubernetes.io/projected/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-kube-api-access-fm95v\") pod \"dnsmasq-dns-675f4bcbfc-hdk7w\" (UID: \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.265251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-config\") pod \"dnsmasq-dns-78dd6ddcc-tz82g\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.265302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tlf6\" (UniqueName: \"kubernetes.io/projected/b7e66215-b828-4c5a-9f56-3cb79797752d-kube-api-access-8tlf6\") pod \"dnsmasq-dns-78dd6ddcc-tz82g\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.265359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tz82g\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.266255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-config\") pod \"dnsmasq-dns-675f4bcbfc-hdk7w\" (UID: \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.286874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm95v\" (UniqueName: \"kubernetes.io/projected/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-kube-api-access-fm95v\") pod \"dnsmasq-dns-675f4bcbfc-hdk7w\" (UID: \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\") " pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.340024 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.366504 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tz82g\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.366604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-config\") pod \"dnsmasq-dns-78dd6ddcc-tz82g\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.366669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tlf6\" (UniqueName: \"kubernetes.io/projected/b7e66215-b828-4c5a-9f56-3cb79797752d-kube-api-access-8tlf6\") pod \"dnsmasq-dns-78dd6ddcc-tz82g\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.368036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tz82g\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.368042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-config\") pod \"dnsmasq-dns-78dd6ddcc-tz82g\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.399720 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tlf6\" (UniqueName: \"kubernetes.io/projected/b7e66215-b828-4c5a-9f56-3cb79797752d-kube-api-access-8tlf6\") pod \"dnsmasq-dns-78dd6ddcc-tz82g\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.478355 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.864255 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hdk7w"]
Jan 29 03:44:21 crc kubenswrapper[4707]: I0129 03:44:21.947370 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tz82g"]
Jan 29 03:44:21 crc kubenswrapper[4707]: W0129 03:44:21.955706 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7e66215_b828_4c5a_9f56_3cb79797752d.slice/crio-7c3fe86163ce0f3291291e23769de3782361b65bbeeece8726977a38be7d76ca WatchSource:0}: Error finding container 7c3fe86163ce0f3291291e23769de3782361b65bbeeece8726977a38be7d76ca: Status 404 returned error can't find the container with id 7c3fe86163ce0f3291291e23769de3782361b65bbeeece8726977a38be7d76ca
Jan 29 03:44:22 crc kubenswrapper[4707]: I0129 03:44:22.358001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w" event={"ID":"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98","Type":"ContainerStarted","Data":"1d15635e30ce4cf140082371db89e5f803f9ab51746b00098cc9c52954b13424"}
Jan 29 03:44:22 crc kubenswrapper[4707]: I0129 03:44:22.361701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g" event={"ID":"b7e66215-b828-4c5a-9f56-3cb79797752d","Type":"ContainerStarted","Data":"7c3fe86163ce0f3291291e23769de3782361b65bbeeece8726977a38be7d76ca"}
Jan 29 03:44:23 crc kubenswrapper[4707]: I0129 03:44:23.826416 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hdk7w"]
Jan 29 03:44:23 crc kubenswrapper[4707]: I0129 03:44:23.839038 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4wdj8"]
Jan 29 03:44:23 crc kubenswrapper[4707]: I0129 03:44:23.840799 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:23 crc kubenswrapper[4707]: I0129 03:44:23.845617 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4wdj8"]
Jan 29 03:44:23 crc kubenswrapper[4707]: I0129 03:44:23.910472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52hhw\" (UniqueName: \"kubernetes.io/projected/804415ac-e69a-4ab4-b372-0743406324eb-kube-api-access-52hhw\") pod \"dnsmasq-dns-666b6646f7-4wdj8\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:23 crc kubenswrapper[4707]: I0129 03:44:23.910551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4wdj8\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:23 crc kubenswrapper[4707]: I0129 03:44:23.910657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-config\") pod \"dnsmasq-dns-666b6646f7-4wdj8\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.012191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-config\") pod \"dnsmasq-dns-666b6646f7-4wdj8\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.012271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52hhw\" (UniqueName: \"kubernetes.io/projected/804415ac-e69a-4ab4-b372-0743406324eb-kube-api-access-52hhw\") pod \"dnsmasq-dns-666b6646f7-4wdj8\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.012296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4wdj8\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.013240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-config\") pod \"dnsmasq-dns-666b6646f7-4wdj8\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.013604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-dns-svc\") pod \"dnsmasq-dns-666b6646f7-4wdj8\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.065467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52hhw\" (UniqueName: \"kubernetes.io/projected/804415ac-e69a-4ab4-b372-0743406324eb-kube-api-access-52hhw\") pod \"dnsmasq-dns-666b6646f7-4wdj8\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.090943 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tz82g"]
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.118029 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-86m88"]
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.119254 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.136143 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-86m88"]
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.177467 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.217031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzhjb\" (UniqueName: \"kubernetes.io/projected/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-kube-api-access-hzhjb\") pod \"dnsmasq-dns-57d769cc4f-86m88\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.217508 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-86m88\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.217553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-config\") pod \"dnsmasq-dns-57d769cc4f-86m88\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.320842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzhjb\" (UniqueName: \"kubernetes.io/projected/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-kube-api-access-hzhjb\") pod \"dnsmasq-dns-57d769cc4f-86m88\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.320892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-86m88\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.320934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-config\") pod \"dnsmasq-dns-57d769cc4f-86m88\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.322302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-config\") pod \"dnsmasq-dns-57d769cc4f-86m88\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.323150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-86m88\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.345214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzhjb\" (UniqueName: \"kubernetes.io/projected/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-kube-api-access-hzhjb\") pod \"dnsmasq-dns-57d769cc4f-86m88\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.448998 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-86m88"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.697987 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4wdj8"]
Jan 29 03:44:24 crc kubenswrapper[4707]: W0129 03:44:24.926661 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5b75e3f_4913_45d7_81fa_91bb3c0fca95.slice/crio-1047c4798563adf2cd5fc246d16f0660cfec653f383974cab942ef3bab2310ab WatchSource:0}: Error finding container 1047c4798563adf2cd5fc246d16f0660cfec653f383974cab942ef3bab2310ab: Status 404 returned error can't find the container with id 1047c4798563adf2cd5fc246d16f0660cfec653f383974cab942ef3bab2310ab
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.930090 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-86m88"]
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.984588 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.986636 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.990708 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.990810 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.991113 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.991669 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.991684 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.991726 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bfks5"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.991827 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Jan 29 03:44:24 crc kubenswrapper[4707]: I0129 03:44:24.992412 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6e14cde-a343-4dc3-b429-77968ac0b7a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6e14cde-a343-4dc3-b429-77968ac0b7a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145367 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7pvj\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-kube-api-access-f7pvj\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.145560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.246412 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6e14cde-a343-4dc3-b429-77968ac0b7a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.246452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6e14cde-a343-4dc3-b429-77968ac0b7a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.246481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.246511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.246564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.246587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.246609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.246631 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.247197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7pvj\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-kube-api-access-f7pvj\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.247333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.247369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.248411 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.248448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.248431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.248493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.249231 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.250762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.255344 4707 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.257795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6e14cde-a343-4dc3-b429-77968ac0b7a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.264684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.284342 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.292861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7pvj\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-kube-api-access-f7pvj\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.293516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6e14cde-a343-4dc3-b429-77968ac0b7a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.298057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.298275 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.301927 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.301996 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.302346 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.304258 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-bglwv" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.304373 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.304441 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.305183 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.332813 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") " pod="openstack/rabbitmq-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.445809 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" 
event={"ID":"c5b75e3f-4913-45d7-81fa-91bb3c0fca95","Type":"ContainerStarted","Data":"1047c4798563adf2cd5fc246d16f0660cfec653f383974cab942ef3bab2310ab"} Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451373 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqnhc\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-kube-api-access-bqnhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451408 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/b8dec80d-f976-4316-9d4a-c18cbefe36ba-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8dec80d-f976-4316-9d4a-c18cbefe36ba-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451575 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.451590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.475397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" event={"ID":"804415ac-e69a-4ab4-b372-0743406324eb","Type":"ContainerStarted","Data":"fa7d10989527c37c11810b676208dca2f84ff68d0cad133554a4c6a6b0bac0db"} Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552422 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8dec80d-f976-4316-9d4a-c18cbefe36ba-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552485 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8dec80d-f976-4316-9d4a-c18cbefe36ba-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552653 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 
03:44:25.552720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqnhc\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-kube-api-access-bqnhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.552765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.553705 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.554006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.554289 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.555126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.563224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8dec80d-f976-4316-9d4a-c18cbefe36ba-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.563523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.564420 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.565246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.594030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8dec80d-f976-4316-9d4a-c18cbefe36ba-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.595838 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.596428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqnhc\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-kube-api-access-bqnhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.609924 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.633324 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 03:44:25 crc kubenswrapper[4707]: I0129 03:44:25.742597 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.139966 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.311297 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 03:44:26 crc kubenswrapper[4707]: W0129 03:44:26.345725 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8dec80d_f976_4316_9d4a_c18cbefe36ba.slice/crio-30e28dd9d5fd9cb817c6bf9759a2dec425a1ed67dde8c6f96b26e7424f9d44d4 WatchSource:0}: Error finding container 30e28dd9d5fd9cb817c6bf9759a2dec425a1ed67dde8c6f96b26e7424f9d44d4: Status 404 returned error can't find the container with id 30e28dd9d5fd9cb817c6bf9759a2dec425a1ed67dde8c6f96b26e7424f9d44d4 Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.490098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6e14cde-a343-4dc3-b429-77968ac0b7a5","Type":"ContainerStarted","Data":"d6b4ebfd47893d08887693be3916da075be659eb379eae895103612113e9a77b"} Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.492943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8dec80d-f976-4316-9d4a-c18cbefe36ba","Type":"ContainerStarted","Data":"30e28dd9d5fd9cb817c6bf9759a2dec425a1ed67dde8c6f96b26e7424f9d44d4"} Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.522843 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.532496 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.536153 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.536487 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.537769 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.538113 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zz9zd" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.538638 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.556327 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.590340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxpp5\" (UniqueName: \"kubernetes.io/projected/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-kube-api-access-xxpp5\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.590442 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-kolla-config\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.590721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.590794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.590821 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.590846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.590978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-config-data-default\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.591017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.697321 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxpp5\" (UniqueName: \"kubernetes.io/projected/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-kube-api-access-xxpp5\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.697416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-kolla-config\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.697450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.697469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0" Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.697489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.697508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.697760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-config-data-default\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.697796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.700319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.701141 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-kolla-config\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.705698 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.706972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-config-data-default\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.707401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.707576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.710553 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.723512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxpp5\" (UniqueName: \"kubernetes.io/projected/8dbb64e8-99fc-4b59-abdc-fce36a90b82f-kube-api-access-xxpp5\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.739747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"8dbb64e8-99fc-4b59-abdc-fce36a90b82f\") " pod="openstack/openstack-galera-0"
Jan 29 03:44:26 crc kubenswrapper[4707]: I0129 03:44:26.896496 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.817093 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.819749 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.824013 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.826752 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.827248 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.827388 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-g2blk"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.836205 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.938428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d6aaab4a-5490-4d22-ac2a-e346a1371683-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.939099 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8fhk\" (UniqueName: \"kubernetes.io/projected/d6aaab4a-5490-4d22-ac2a-e346a1371683-kube-api-access-c8fhk\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.939153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6aaab4a-5490-4d22-ac2a-e346a1371683-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.939208 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6aaab4a-5490-4d22-ac2a-e346a1371683-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.939259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6aaab4a-5490-4d22-ac2a-e346a1371683-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.939297 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.939352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d6aaab4a-5490-4d22-ac2a-e346a1371683-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.939418 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d6aaab4a-5490-4d22-ac2a-e346a1371683-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.972082 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.977376 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.984368 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.984777 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Jan 29 03:44:27 crc kubenswrapper[4707]: I0129 03:44:27.987334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vjfb4"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6aaab4a-5490-4d22-ac2a-e346a1371683-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33afc350-9c09-4d5f-aa86-80ccc0b670ba-config-data\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33afc350-9c09-4d5f-aa86-80ccc0b670ba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042613 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6aaab4a-5490-4d22-ac2a-e346a1371683-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d6aaab4a-5490-4d22-ac2a-e346a1371683-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d6aaab4a-5490-4d22-ac2a-e346a1371683-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33afc350-9c09-4d5f-aa86-80ccc0b670ba-kolla-config\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042808 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtcp8\" (UniqueName: \"kubernetes.io/projected/33afc350-9c09-4d5f-aa86-80ccc0b670ba-kube-api-access-qtcp8\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d6aaab4a-5490-4d22-ac2a-e346a1371683-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8fhk\" (UniqueName: \"kubernetes.io/projected/d6aaab4a-5490-4d22-ac2a-e346a1371683-kube-api-access-c8fhk\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/33afc350-9c09-4d5f-aa86-80ccc0b670ba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.042923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6aaab4a-5490-4d22-ac2a-e346a1371683-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.044206 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.044446 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.047204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d6aaab4a-5490-4d22-ac2a-e346a1371683-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.047884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d6aaab4a-5490-4d22-ac2a-e346a1371683-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.048214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d6aaab4a-5490-4d22-ac2a-e346a1371683-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.053849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6aaab4a-5490-4d22-ac2a-e346a1371683-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.055522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6aaab4a-5490-4d22-ac2a-e346a1371683-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.061894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6aaab4a-5490-4d22-ac2a-e346a1371683-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.072681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8fhk\" (UniqueName: \"kubernetes.io/projected/d6aaab4a-5490-4d22-ac2a-e346a1371683-kube-api-access-c8fhk\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.118807 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d6aaab4a-5490-4d22-ac2a-e346a1371683\") " pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.145782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33afc350-9c09-4d5f-aa86-80ccc0b670ba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.145825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33afc350-9c09-4d5f-aa86-80ccc0b670ba-config-data\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.145890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33afc350-9c09-4d5f-aa86-80ccc0b670ba-kolla-config\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.145940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtcp8\" (UniqueName: \"kubernetes.io/projected/33afc350-9c09-4d5f-aa86-80ccc0b670ba-kube-api-access-qtcp8\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.147211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/33afc350-9c09-4d5f-aa86-80ccc0b670ba-kolla-config\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.147855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33afc350-9c09-4d5f-aa86-80ccc0b670ba-config-data\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.151178 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/33afc350-9c09-4d5f-aa86-80ccc0b670ba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.155505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/33afc350-9c09-4d5f-aa86-80ccc0b670ba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.156128 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.161550 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33afc350-9c09-4d5f-aa86-80ccc0b670ba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.185586 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtcp8\" (UniqueName: \"kubernetes.io/projected/33afc350-9c09-4d5f-aa86-80ccc0b670ba-kube-api-access-qtcp8\") pod \"memcached-0\" (UID: \"33afc350-9c09-4d5f-aa86-80ccc0b670ba\") " pod="openstack/memcached-0"
Jan 29 03:44:28 crc kubenswrapper[4707]: I0129 03:44:28.299073 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 29 03:44:29 crc kubenswrapper[4707]: I0129 03:44:29.843007 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 29 03:44:29 crc kubenswrapper[4707]: I0129 03:44:29.845387 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 29 03:44:29 crc kubenswrapper[4707]: I0129 03:44:29.848835 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-758v6"
Jan 29 03:44:29 crc kubenswrapper[4707]: I0129 03:44:29.859387 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 29 03:44:29 crc kubenswrapper[4707]: I0129 03:44:29.927617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j5wk\" (UniqueName: \"kubernetes.io/projected/b5a35c97-c2c4-4513-b755-774a90aa56ff-kube-api-access-6j5wk\") pod \"kube-state-metrics-0\" (UID: \"b5a35c97-c2c4-4513-b755-774a90aa56ff\") " pod="openstack/kube-state-metrics-0"
Jan 29 03:44:30 crc kubenswrapper[4707]: I0129 03:44:30.029576 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j5wk\" (UniqueName: \"kubernetes.io/projected/b5a35c97-c2c4-4513-b755-774a90aa56ff-kube-api-access-6j5wk\") pod \"kube-state-metrics-0\" (UID: \"b5a35c97-c2c4-4513-b755-774a90aa56ff\") " pod="openstack/kube-state-metrics-0"
Jan 29 03:44:30 crc kubenswrapper[4707]: I0129 03:44:30.050385 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j5wk\" (UniqueName: \"kubernetes.io/projected/b5a35c97-c2c4-4513-b755-774a90aa56ff-kube-api-access-6j5wk\") pod \"kube-state-metrics-0\" (UID: \"b5a35c97-c2c4-4513-b755-774a90aa56ff\") " pod="openstack/kube-state-metrics-0"
Jan 29 03:44:30 crc kubenswrapper[4707]: I0129 03:44:30.168787 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.239082 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hpq5q"]
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.240709 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.244226 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-pwzz4"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.244555 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.244676 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.256933 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hxz2d"]
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.259759 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.286699 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hpq5q"]
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.293946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f831116-140a-4c6b-8d7c-aad99fcaf97c-scripts\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.294001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l66fn\" (UniqueName: \"kubernetes.io/projected/9f831116-140a-4c6b-8d7c-aad99fcaf97c-kube-api-access-l66fn\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.294056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f831116-140a-4c6b-8d7c-aad99fcaf97c-var-run-ovn\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.294370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f831116-140a-4c6b-8d7c-aad99fcaf97c-var-log-ovn\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.294464 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f831116-140a-4c6b-8d7c-aad99fcaf97c-ovn-controller-tls-certs\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.294511 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f831116-140a-4c6b-8d7c-aad99fcaf97c-combined-ca-bundle\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.294856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f831116-140a-4c6b-8d7c-aad99fcaf97c-var-run\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.298797 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hxz2d"]
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f831116-140a-4c6b-8d7c-aad99fcaf97c-var-log-ovn\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397117 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f831116-140a-4c6b-8d7c-aad99fcaf97c-ovn-controller-tls-certs\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f831116-140a-4c6b-8d7c-aad99fcaf97c-combined-ca-bundle\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-var-run\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-var-lib\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r54cn\" (UniqueName: \"kubernetes.io/projected/f25cc401-b568-4936-9947-2a54b5f6dea9-kube-api-access-r54cn\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f831116-140a-4c6b-8d7c-aad99fcaf97c-var-run\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-var-log\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397755 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-etc-ovs\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f831116-140a-4c6b-8d7c-aad99fcaf97c-scripts\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l66fn\" (UniqueName: \"kubernetes.io/projected/9f831116-140a-4c6b-8d7c-aad99fcaf97c-kube-api-access-l66fn\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f831116-140a-4c6b-8d7c-aad99fcaf97c-var-run-ovn\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.397894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f831116-140a-4c6b-8d7c-aad99fcaf97c-var-run\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.398032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f831116-140a-4c6b-8d7c-aad99fcaf97c-var-run-ovn\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.398045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25cc401-b568-4936-9947-2a54b5f6dea9-scripts\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.398758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f831116-140a-4c6b-8d7c-aad99fcaf97c-var-log-ovn\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.401057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f831116-140a-4c6b-8d7c-aad99fcaf97c-scripts\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.406019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f831116-140a-4c6b-8d7c-aad99fcaf97c-ovn-controller-tls-certs\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.412404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f831116-140a-4c6b-8d7c-aad99fcaf97c-combined-ca-bundle\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.420416 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l66fn\" (UniqueName: \"kubernetes.io/projected/9f831116-140a-4c6b-8d7c-aad99fcaf97c-kube-api-access-l66fn\") pod \"ovn-controller-hpq5q\" (UID: \"9f831116-140a-4c6b-8d7c-aad99fcaf97c\") " pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.499857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-var-run\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.500035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-var-lib\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.500094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r54cn\" (UniqueName: \"kubernetes.io/projected/f25cc401-b568-4936-9947-2a54b5f6dea9-kube-api-access-r54cn\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.500152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-var-log\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.500192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-etc-ovs\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.500446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-var-log\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.500125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-var-run\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.500449 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-etc-ovs\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.500365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f25cc401-b568-4936-9947-2a54b5f6dea9-var-lib\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.500831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25cc401-b568-4936-9947-2a54b5f6dea9-scripts\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.504376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25cc401-b568-4936-9947-2a54b5f6dea9-scripts\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.522098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r54cn\" (UniqueName: \"kubernetes.io/projected/f25cc401-b568-4936-9947-2a54b5f6dea9-kube-api-access-r54cn\") pod \"ovn-controller-ovs-hxz2d\" (UID: \"f25cc401-b568-4936-9947-2a54b5f6dea9\") " pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.577135 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.583425 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.930313 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.932132 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.935821 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.936977 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.937474 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.937730 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.937772 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-j558c" Jan 29 03:44:33 crc kubenswrapper[4707]: I0129 03:44:33.949807 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.009346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.009400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.009457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-config\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.009487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.009516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.009724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.009753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.009772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn5wn\" (UniqueName: 
\"kubernetes.io/projected/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-kube-api-access-jn5wn\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.111607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-config\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.111742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.111810 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.111873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.111932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn5wn\" (UniqueName: \"kubernetes.io/projected/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-kube-api-access-jn5wn\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " 
pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.111968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.112122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.112176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.117764 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.119117 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.119238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.119264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-config\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.122013 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.124267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.125923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.149572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn5wn\" (UniqueName: \"kubernetes.io/projected/445c0ce8-31bb-4f8a-a139-e1d7a63d38f7-kube-api-access-jn5wn\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") 
" pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.151701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7\") " pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:34 crc kubenswrapper[4707]: I0129 03:44:34.269665 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.380756 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.383295 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.389987 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.390011 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.390132 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-d8lqw" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.390274 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.390604 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.486838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b5dee206-46c6-44c4-885d-0d8ba9149bfd-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.486914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5dee206-46c6-44c4-885d-0d8ba9149bfd-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.486946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b5dee206-46c6-44c4-885d-0d8ba9149bfd-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.486977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5dee206-46c6-44c4-885d-0d8ba9149bfd-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.487017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5dee206-46c6-44c4-885d-0d8ba9149bfd-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.487054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dee206-46c6-44c4-885d-0d8ba9149bfd-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.487088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r4kw\" (UniqueName: \"kubernetes.io/projected/b5dee206-46c6-44c4-885d-0d8ba9149bfd-kube-api-access-7r4kw\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.487116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.589392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5dee206-46c6-44c4-885d-0d8ba9149bfd-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.589463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5dee206-46c6-44c4-885d-0d8ba9149bfd-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.589495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b5dee206-46c6-44c4-885d-0d8ba9149bfd-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: 
I0129 03:44:37.589518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5dee206-46c6-44c4-885d-0d8ba9149bfd-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.589568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5dee206-46c6-44c4-885d-0d8ba9149bfd-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.589605 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dee206-46c6-44c4-885d-0d8ba9149bfd-config\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.589638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r4kw\" (UniqueName: \"kubernetes.io/projected/b5dee206-46c6-44c4-885d-0d8ba9149bfd-kube-api-access-7r4kw\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.589667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.590111 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.595409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5dee206-46c6-44c4-885d-0d8ba9149bfd-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.598314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dee206-46c6-44c4-885d-0d8ba9149bfd-config\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.599042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5dee206-46c6-44c4-885d-0d8ba9149bfd-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.601172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b5dee206-46c6-44c4-885d-0d8ba9149bfd-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.601922 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5dee206-46c6-44c4-885d-0d8ba9149bfd-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc 
kubenswrapper[4707]: I0129 03:44:37.624784 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5dee206-46c6-44c4-885d-0d8ba9149bfd-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.627268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r4kw\" (UniqueName: \"kubernetes.io/projected/b5dee206-46c6-44c4-885d-0d8ba9149bfd-kube-api-access-7r4kw\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.627885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b5dee206-46c6-44c4-885d-0d8ba9149bfd\") " pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:37 crc kubenswrapper[4707]: I0129 03:44:37.712695 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 03:44:40 crc kubenswrapper[4707]: I0129 03:44:40.447379 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 29 03:44:40 crc kubenswrapper[4707]: I0129 03:44:40.738073 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 03:44:40 crc kubenswrapper[4707]: I0129 03:44:40.850296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 29 03:44:41 crc kubenswrapper[4707]: I0129 03:44:41.765531 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 03:44:45 crc kubenswrapper[4707]: W0129 03:44:45.685413 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5a35c97_c2c4_4513_b755_774a90aa56ff.slice/crio-e2abadeef61bad37e791079ca4115eef9595733a6078cac8ff6a87cf21ad4575 WatchSource:0}: Error finding container e2abadeef61bad37e791079ca4115eef9595733a6078cac8ff6a87cf21ad4575: Status 404 returned error can't find the container with id e2abadeef61bad37e791079ca4115eef9595733a6078cac8ff6a87cf21ad4575 Jan 29 03:44:45 crc kubenswrapper[4707]: W0129 03:44:45.691474 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33afc350_9c09_4d5f_aa86_80ccc0b670ba.slice/crio-1b880405efd07e6ffbc4e5a0fd120935aa254b6573990e6e56469f80ccc576ab WatchSource:0}: Error finding container 1b880405efd07e6ffbc4e5a0fd120935aa254b6573990e6e56469f80ccc576ab: Status 404 returned error can't find the container with id 1b880405efd07e6ffbc4e5a0fd120935aa254b6573990e6e56469f80ccc576ab Jan 29 03:44:45 crc kubenswrapper[4707]: W0129 03:44:45.700382 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6aaab4a_5490_4d22_ac2a_e346a1371683.slice/crio-9ea077eab6b7f84708b7b913b2ec1f519eeac8adb80260d0ae308fbfab5687ec WatchSource:0}: Error finding container 9ea077eab6b7f84708b7b913b2ec1f519eeac8adb80260d0ae308fbfab5687ec: Status 404 returned error can't find the container with id 9ea077eab6b7f84708b7b913b2ec1f519eeac8adb80260d0ae308fbfab5687ec Jan 29 03:44:45 crc kubenswrapper[4707]: I0129 03:44:45.713524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8dbb64e8-99fc-4b59-abdc-fce36a90b82f","Type":"ContainerStarted","Data":"54cfde07285273b25e051851d0fdcd494b6b16a7a4b8d00e87ee67a7755d5137"} Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.714291 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.714527 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fm95v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-hdk7w_openstack(9585c592-2ad9-4393-9c0d-4dd2b3ae9e98): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.716018 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w" podUID="9585c592-2ad9-4393-9c0d-4dd2b3ae9e98"
Jan 29 03:44:45 crc kubenswrapper[4707]: I0129 03:44:45.718977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b5a35c97-c2c4-4513-b755-774a90aa56ff","Type":"ContainerStarted","Data":"e2abadeef61bad37e791079ca4115eef9595733a6078cac8ff6a87cf21ad4575"}
Jan 29 03:44:45 crc kubenswrapper[4707]: I0129 03:44:45.721132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"33afc350-9c09-4d5f-aa86-80ccc0b670ba","Type":"ContainerStarted","Data":"1b880405efd07e6ffbc4e5a0fd120935aa254b6573990e6e56469f80ccc576ab"}
Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.745529 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.745727 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzhjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-86m88_openstack(c5b75e3f-4913-45d7-81fa-91bb3c0fca95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.747283 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" podUID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95"
Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.860308 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.860805 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8tlf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-tz82g_openstack(b7e66215-b828-4c5a-9f56-3cb79797752d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.862007 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g" podUID="b7e66215-b828-4c5a-9f56-3cb79797752d"
Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.958287 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.958551 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-52hhw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-4wdj8_openstack(804415ac-e69a-4ab4-b372-0743406324eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 29 03:44:45 crc kubenswrapper[4707]: E0129 03:44:45.960519 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" podUID="804415ac-e69a-4ab4-b372-0743406324eb"
Jan 29 03:44:46 crc kubenswrapper[4707]: I0129 03:44:46.350115 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 29 03:44:46 crc kubenswrapper[4707]: I0129 03:44:46.408413 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hpq5q"]
Jan 29 03:44:46 crc kubenswrapper[4707]: I0129 03:44:46.462645 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 29 03:44:46 crc kubenswrapper[4707]: W0129 03:44:46.480391 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod445c0ce8_31bb_4f8a_a139_e1d7a63d38f7.slice/crio-52eed94505c38dd88bea591cdc5c80570699b56d2c794720ffc2b9a34dd95310 WatchSource:0}: Error finding container 52eed94505c38dd88bea591cdc5c80570699b56d2c794720ffc2b9a34dd95310: Status 404 returned error can't find the container with id 52eed94505c38dd88bea591cdc5c80570699b56d2c794720ffc2b9a34dd95310
Jan 29 03:44:46 crc kubenswrapper[4707]: W0129 03:44:46.480873 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5dee206_46c6_44c4_885d_0d8ba9149bfd.slice/crio-ae17115d204ecd00014a5d58f55091598a2200c206bd0738af7d1fc56dedbc93 WatchSource:0}: Error finding container ae17115d204ecd00014a5d58f55091598a2200c206bd0738af7d1fc56dedbc93: Status 404 returned error can't find the container with id ae17115d204ecd00014a5d58f55091598a2200c206bd0738af7d1fc56dedbc93
Jan 29 03:44:46 crc kubenswrapper[4707]: I0129 03:44:46.735370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d6aaab4a-5490-4d22-ac2a-e346a1371683","Type":"ContainerStarted","Data":"9ea077eab6b7f84708b7b913b2ec1f519eeac8adb80260d0ae308fbfab5687ec"}
Jan 29 03:44:46 crc kubenswrapper[4707]: I0129 03:44:46.736352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7","Type":"ContainerStarted","Data":"52eed94505c38dd88bea591cdc5c80570699b56d2c794720ffc2b9a34dd95310"}
Jan 29 03:44:46 crc kubenswrapper[4707]: I0129 03:44:46.750855 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b5dee206-46c6-44c4-885d-0d8ba9149bfd","Type":"ContainerStarted","Data":"ae17115d204ecd00014a5d58f55091598a2200c206bd0738af7d1fc56dedbc93"}
Jan 29 03:44:46 crc kubenswrapper[4707]: I0129 03:44:46.767673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q" event={"ID":"9f831116-140a-4c6b-8d7c-aad99fcaf97c","Type":"ContainerStarted","Data":"b54c3f5f1c7eb1ee72e9351c060297c61155816fcf613757a6f3405ba3d8f258"}
Jan 29 03:44:46 crc kubenswrapper[4707]: E0129 03:44:46.769697 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" podUID="804415ac-e69a-4ab4-b372-0743406324eb"
Jan 29 03:44:46 crc kubenswrapper[4707]: E0129 03:44:46.769910 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" podUID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95"
Jan 29 03:44:46 crc kubenswrapper[4707]: I0129 03:44:46.936187 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hxz2d"]
Jan 29 03:44:47 crc kubenswrapper[4707]: W0129 03:44:47.228269 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25cc401_b568_4936_9947_2a54b5f6dea9.slice/crio-99e027b3cb82e7ddb6ea6e7d185a978d1da04a43aab8fa56667c73ae4217581c WatchSource:0}: Error finding container 99e027b3cb82e7ddb6ea6e7d185a978d1da04a43aab8fa56667c73ae4217581c: Status 404 returned error can't find the container with id 99e027b3cb82e7ddb6ea6e7d185a978d1da04a43aab8fa56667c73ae4217581c
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.305664 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.312069 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.395829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-config\") pod \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\" (UID: \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\") "
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.395905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tlf6\" (UniqueName: \"kubernetes.io/projected/b7e66215-b828-4c5a-9f56-3cb79797752d-kube-api-access-8tlf6\") pod \"b7e66215-b828-4c5a-9f56-3cb79797752d\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") "
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.396032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-config\") pod \"b7e66215-b828-4c5a-9f56-3cb79797752d\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") "
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.396067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-dns-svc\") pod \"b7e66215-b828-4c5a-9f56-3cb79797752d\" (UID: \"b7e66215-b828-4c5a-9f56-3cb79797752d\") "
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.396096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm95v\" (UniqueName: \"kubernetes.io/projected/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-kube-api-access-fm95v\") pod \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\" (UID: \"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98\") "
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.398452 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7e66215-b828-4c5a-9f56-3cb79797752d" (UID: "b7e66215-b828-4c5a-9f56-3cb79797752d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.399135 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-config" (OuterVolumeSpecName: "config") pod "9585c592-2ad9-4393-9c0d-4dd2b3ae9e98" (UID: "9585c592-2ad9-4393-9c0d-4dd2b3ae9e98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.399947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-config" (OuterVolumeSpecName: "config") pod "b7e66215-b828-4c5a-9f56-3cb79797752d" (UID: "b7e66215-b828-4c5a-9f56-3cb79797752d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.403280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-kube-api-access-fm95v" (OuterVolumeSpecName: "kube-api-access-fm95v") pod "9585c592-2ad9-4393-9c0d-4dd2b3ae9e98" (UID: "9585c592-2ad9-4393-9c0d-4dd2b3ae9e98"). InnerVolumeSpecName "kube-api-access-fm95v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.404216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e66215-b828-4c5a-9f56-3cb79797752d-kube-api-access-8tlf6" (OuterVolumeSpecName: "kube-api-access-8tlf6") pod "b7e66215-b828-4c5a-9f56-3cb79797752d" (UID: "b7e66215-b828-4c5a-9f56-3cb79797752d"). InnerVolumeSpecName "kube-api-access-8tlf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.502964 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.503041 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7e66215-b828-4c5a-9f56-3cb79797752d-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.503061 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm95v\" (UniqueName: \"kubernetes.io/projected/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-kube-api-access-fm95v\") on node \"crc\" DevicePath \"\""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.503077 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.503090 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tlf6\" (UniqueName: \"kubernetes.io/projected/b7e66215-b828-4c5a-9f56-3cb79797752d-kube-api-access-8tlf6\") on node \"crc\" DevicePath \"\""
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.780717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8dec80d-f976-4316-9d4a-c18cbefe36ba","Type":"ContainerStarted","Data":"6d5d6d2e0fa06b8db3118e28e603b5b03f428a0b7fde78b5b540f4727f0a499a"}
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.783059 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g"
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.783223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tz82g" event={"ID":"b7e66215-b828-4c5a-9f56-3cb79797752d","Type":"ContainerDied","Data":"7c3fe86163ce0f3291291e23769de3782361b65bbeeece8726977a38be7d76ca"}
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.785231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hxz2d" event={"ID":"f25cc401-b568-4936-9947-2a54b5f6dea9","Type":"ContainerStarted","Data":"99e027b3cb82e7ddb6ea6e7d185a978d1da04a43aab8fa56667c73ae4217581c"}
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.787620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6e14cde-a343-4dc3-b429-77968ac0b7a5","Type":"ContainerStarted","Data":"6503da289d12f1bfc63a47f7d284f1b2b501095f2d4223347f4b591fe440389f"}
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.789165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w" event={"ID":"9585c592-2ad9-4393-9c0d-4dd2b3ae9e98","Type":"ContainerDied","Data":"1d15635e30ce4cf140082371db89e5f803f9ab51746b00098cc9c52954b13424"}
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.789210 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-hdk7w"
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.848724 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tz82g"]
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.855299 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tz82g"]
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.910362 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hdk7w"]
Jan 29 03:44:47 crc kubenswrapper[4707]: I0129 03:44:47.937221 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-hdk7w"]
Jan 29 03:44:49 crc kubenswrapper[4707]: I0129 03:44:49.256914 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9585c592-2ad9-4393-9c0d-4dd2b3ae9e98" path="/var/lib/kubelet/pods/9585c592-2ad9-4393-9c0d-4dd2b3ae9e98/volumes"
Jan 29 03:44:49 crc kubenswrapper[4707]: I0129 03:44:49.257698 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e66215-b828-4c5a-9f56-3cb79797752d" path="/var/lib/kubelet/pods/b7e66215-b828-4c5a-9f56-3cb79797752d/volumes"
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.846201 4707 generic.go:334] "Generic (PLEG): container finished" podID="f25cc401-b568-4936-9947-2a54b5f6dea9" containerID="da2815c5cf7b3fa1e1f5783b27c281e2a48b6e5c1cc94c48af40d81ffc937ae5" exitCode=0
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.846333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hxz2d" event={"ID":"f25cc401-b568-4936-9947-2a54b5f6dea9","Type":"ContainerDied","Data":"da2815c5cf7b3fa1e1f5783b27c281e2a48b6e5c1cc94c48af40d81ffc937ae5"}
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.848741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b5a35c97-c2c4-4513-b755-774a90aa56ff","Type":"ContainerStarted","Data":"c066bc913675ca0557b9bd95e6a7ca5b9313106c96308e38ec843795049a8230"}
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.849356 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.851436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q" event={"ID":"9f831116-140a-4c6b-8d7c-aad99fcaf97c","Type":"ContainerStarted","Data":"812ae3b647702ecee3b57adf607f751d04c0dcee075e655bbe2446d2ab08af72"}
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.851603 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hpq5q"
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.853489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"33afc350-9c09-4d5f-aa86-80ccc0b670ba","Type":"ContainerStarted","Data":"78e422b690365278a14ddeb2fbe87cf3323b142cc6c7d8895795c446f6bcc63c"}
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.855594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d6aaab4a-5490-4d22-ac2a-e346a1371683","Type":"ContainerStarted","Data":"c941c12cde2f1bcd1c46887422e189e304871481511503d644fcbb8f87e5d4cf"}
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.857902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7","Type":"ContainerStarted","Data":"2e601d8ae5337e5948af4f100d135c264fb5bc0956216a8e71ed2058fc2c63cf"}
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.860076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8dbb64e8-99fc-4b59-abdc-fce36a90b82f","Type":"ContainerStarted","Data":"e381512c4a425c0e7d3bbea5f07b352c8b71088bddc6ebb260c9302c70041cae"}
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.862123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b5dee206-46c6-44c4-885d-0d8ba9149bfd","Type":"ContainerStarted","Data":"e7045dc82424313ae711f7ef12693a6e94e629e34c6d0ea706259382c27370d3"}
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.946549 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.530014561 podStartE2EDuration="25.946506188s" podCreationTimestamp="2026-01-29 03:44:29 +0000 UTC" firstStartedPulling="2026-01-29 03:44:45.6910806 +0000 UTC m=+1039.175309505" lastFinishedPulling="2026-01-29 03:44:54.107572227 +0000 UTC m=+1047.591801132" observedRunningTime="2026-01-29 03:44:54.918927613 +0000 UTC m=+1048.403156518" watchObservedRunningTime="2026-01-29 03:44:54.946506188 +0000 UTC m=+1048.430735093"
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.981376 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.641769157 podStartE2EDuration="27.981347887s" podCreationTimestamp="2026-01-29 03:44:27 +0000 UTC" firstStartedPulling="2026-01-29 03:44:45.69428621 +0000 UTC m=+1039.178515115" lastFinishedPulling="2026-01-29 03:44:53.03386494 +0000 UTC m=+1046.518093845" observedRunningTime="2026-01-29 03:44:54.970121672 +0000 UTC m=+1048.454350577" watchObservedRunningTime="2026-01-29 03:44:54.981347887 +0000 UTC m=+1048.465576782"
Jan 29 03:44:54 crc kubenswrapper[4707]: I0129 03:44:54.993904 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hpq5q" podStartSLOduration=14.496020901 podStartE2EDuration="21.993882629s" podCreationTimestamp="2026-01-29 03:44:33 +0000 UTC" firstStartedPulling="2026-01-29 03:44:46.479159231 +0000 UTC m=+1039.963388136" lastFinishedPulling="2026-01-29 03:44:53.977020959 +0000 UTC m=+1047.461249864" observedRunningTime="2026-01-29 03:44:54.991590855 +0000 UTC m=+1048.475819760" watchObservedRunningTime="2026-01-29 03:44:54.993882629 +0000 UTC m=+1048.478111534"
Jan 29 03:44:55 crc kubenswrapper[4707]: I0129 03:44:55.884486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hxz2d" event={"ID":"f25cc401-b568-4936-9947-2a54b5f6dea9","Type":"ContainerStarted","Data":"ad30ce22f2be849bbfa5da84b8a6decb8808f44ab9335b2d4ffafdf9243aa5ca"}
Jan 29 03:44:55 crc kubenswrapper[4707]: I0129 03:44:55.885009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hxz2d" event={"ID":"f25cc401-b568-4936-9947-2a54b5f6dea9","Type":"ContainerStarted","Data":"4bc43602ceb04e0c12c17b9c17f05bd1915ba52e301b69415ab0e471ca531baf"}
Jan 29 03:44:55 crc kubenswrapper[4707]: I0129 03:44:55.885958 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Jan 29 03:44:55 crc kubenswrapper[4707]: I0129 03:44:55.909422 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hxz2d" podStartSLOduration=16.171017872 podStartE2EDuration="22.909394711s" podCreationTimestamp="2026-01-29 03:44:33 +0000 UTC" firstStartedPulling="2026-01-29 03:44:47.231137779 +0000 UTC m=+1040.715366684" lastFinishedPulling="2026-01-29 03:44:53.969514618 +0000 UTC m=+1047.453743523" observedRunningTime="2026-01-29 03:44:55.904982347 +0000 UTC m=+1049.389211252" watchObservedRunningTime="2026-01-29 03:44:55.909394711 +0000 UTC m=+1049.393623626"
Jan 29 03:44:56 crc kubenswrapper[4707]: I0129 03:44:56.894408 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:56 crc kubenswrapper[4707]: I0129 03:44:56.894777 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:44:58 crc kubenswrapper[4707]: I0129 03:44:58.914334 4707 generic.go:334] "Generic (PLEG): container finished" podID="8dbb64e8-99fc-4b59-abdc-fce36a90b82f" containerID="e381512c4a425c0e7d3bbea5f07b352c8b71088bddc6ebb260c9302c70041cae" exitCode=0
Jan 29 03:44:58 crc kubenswrapper[4707]: I0129 03:44:58.914392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8dbb64e8-99fc-4b59-abdc-fce36a90b82f","Type":"ContainerDied","Data":"e381512c4a425c0e7d3bbea5f07b352c8b71088bddc6ebb260c9302c70041cae"}
Jan 29 03:44:58 crc kubenswrapper[4707]: I0129 03:44:58.919180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"445c0ce8-31bb-4f8a-a139-e1d7a63d38f7","Type":"ContainerStarted","Data":"a9fb33acda00e5d42fe85bdf384aab10ae5e38761d181220ba48b2ebcb062ea7"}
Jan 29 03:44:58 crc kubenswrapper[4707]: I0129 03:44:58.921521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b5dee206-46c6-44c4-885d-0d8ba9149bfd","Type":"ContainerStarted","Data":"30f4fbb043ed082a73fe5021f5eca4f88e03b41f6a7c9aa0be34c93646d231c3"}
Jan 29 03:44:58 crc kubenswrapper[4707]: I0129 03:44:58.924472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d6aaab4a-5490-4d22-ac2a-e346a1371683","Type":"ContainerDied","Data":"c941c12cde2f1bcd1c46887422e189e304871481511503d644fcbb8f87e5d4cf"}
Jan 29 03:44:58 crc kubenswrapper[4707]: I0129 03:44:58.924422 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6aaab4a-5490-4d22-ac2a-e346a1371683" containerID="c941c12cde2f1bcd1c46887422e189e304871481511503d644fcbb8f87e5d4cf" exitCode=0
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.020559 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.753838984 podStartE2EDuration="23.020512821s" podCreationTimestamp="2026-01-29 03:44:36 +0000 UTC" firstStartedPulling="2026-01-29 03:44:46.48518007 +0000 UTC m=+1039.969408975" lastFinishedPulling="2026-01-29 03:44:57.751853877 +0000 UTC m=+1051.236082812" observedRunningTime="2026-01-29 03:44:59.01016321 +0000 UTC m=+1052.494392125" watchObservedRunningTime="2026-01-29 03:44:59.020512821 +0000 UTC m=+1052.504741736"
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.045611 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.798700915 podStartE2EDuration="27.045591835s" podCreationTimestamp="2026-01-29 03:44:32 +0000 UTC" firstStartedPulling="2026-01-29 03:44:46.484590304 +0000 UTC m=+1039.968819209" lastFinishedPulling="2026-01-29 03:44:57.731481224 +0000 UTC m=+1051.215710129" observedRunningTime="2026-01-29 03:44:59.030254865 +0000 UTC m=+1052.514483790" watchObservedRunningTime="2026-01-29 03:44:59.045591835 +0000 UTC m=+1052.529820740"
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.270561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.935994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d6aaab4a-5490-4d22-ac2a-e346a1371683","Type":"ContainerStarted","Data":"450ec1e2b7cca3c5eae86009070bec9bce6165da8b90da6b066fc35e4d61ce3f"}
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.939387 4707 generic.go:334] "Generic (PLEG): container finished" podID="804415ac-e69a-4ab4-b372-0743406324eb" containerID="0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb" exitCode=0
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.939459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" event={"ID":"804415ac-e69a-4ab4-b372-0743406324eb","Type":"ContainerDied","Data":"0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb"}
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.943756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8dbb64e8-99fc-4b59-abdc-fce36a90b82f","Type":"ContainerStarted","Data":"d08129e426ef4bf5952677366e5a0fd1848b44b42f06446a28c3db460f80e438"}
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.946234 4707 generic.go:334] "Generic (PLEG): container finished" podID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95" containerID="6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af" exitCode=0
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.946340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" event={"ID":"c5b75e3f-4913-45d7-81fa-91bb3c0fca95","Type":"ContainerDied","Data":"6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af"}
Jan 29 03:44:59 crc kubenswrapper[4707]: I0129 03:44:59.983996 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.729878844 podStartE2EDuration="33.983967199s" podCreationTimestamp="2026-01-29 03:44:26 +0000 UTC" firstStartedPulling="2026-01-29 03:44:45.718249283 +0000 UTC m=+1039.202478178" lastFinishedPulling="2026-01-29 03:44:53.972337588 +0000 UTC m=+1047.456566533" observedRunningTime="2026-01-29 03:44:59.975487911 +0000 UTC m=+1053.459716826" watchObservedRunningTime="2026-01-29 03:44:59.983967199 +0000 UTC m=+1053.468196134"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.012449 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.127779323 podStartE2EDuration="35.012410058s" podCreationTimestamp="2026-01-29 03:44:25 +0000 UTC" firstStartedPulling="2026-01-29 03:44:45.688432925 +0000 UTC m=+1039.172661830" lastFinishedPulling="2026-01-29 03:44:53.57306366 +0000 UTC m=+1047.057292565" observedRunningTime="2026-01-29 03:45:00.007560522 +0000 UTC m=+1053.491789437" watchObservedRunningTime="2026-01-29 03:45:00.012410058 +0000 UTC m=+1053.496638963"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.183851 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.186128 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"]
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.187840 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.191349 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.191574 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.200018 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"]
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.259196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afe0e902-d9e3-4b02-b1d9-6f187a075acf-secret-volume\") pod \"collect-profiles-29494305-62h5w\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.259399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afe0e902-d9e3-4b02-b1d9-6f187a075acf-config-volume\") pod \"collect-profiles-29494305-62h5w\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.259505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kxjq\" (UniqueName: \"kubernetes.io/projected/afe0e902-d9e3-4b02-b1d9-6f187a075acf-kube-api-access-2kxjq\") pod \"collect-profiles-29494305-62h5w\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.361058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afe0e902-d9e3-4b02-b1d9-6f187a075acf-config-volume\") pod \"collect-profiles-29494305-62h5w\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.361372 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kxjq\" (UniqueName: \"kubernetes.io/projected/afe0e902-d9e3-4b02-b1d9-6f187a075acf-kube-api-access-2kxjq\") pod \"collect-profiles-29494305-62h5w\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.361495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afe0e902-d9e3-4b02-b1d9-6f187a075acf-secret-volume\") pod \"collect-profiles-29494305-62h5w\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.362754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afe0e902-d9e3-4b02-b1d9-6f187a075acf-config-volume\") pod \"collect-profiles-29494305-62h5w\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"
Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.369714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName:
\"kubernetes.io/secret/afe0e902-d9e3-4b02-b1d9-6f187a075acf-secret-volume\") pod \"collect-profiles-29494305-62h5w\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w" Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.381818 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kxjq\" (UniqueName: \"kubernetes.io/projected/afe0e902-d9e3-4b02-b1d9-6f187a075acf-kube-api-access-2kxjq\") pod \"collect-profiles-29494305-62h5w\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w" Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.515843 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w" Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.965400 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" event={"ID":"c5b75e3f-4913-45d7-81fa-91bb3c0fca95","Type":"ContainerStarted","Data":"60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f"} Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.967060 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.970464 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" event={"ID":"804415ac-e69a-4ab4-b372-0743406324eb","Type":"ContainerStarted","Data":"1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388"} Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.971261 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.971373 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"] Jan 29 03:45:00 crc kubenswrapper[4707]: W0129 03:45:00.992581 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafe0e902_d9e3_4b02_b1d9_6f187a075acf.slice/crio-fdf84f45e418158bcb20d3b95aec8a2984b2931ba66ed90c8138c08478a02721 WatchSource:0}: Error finding container fdf84f45e418158bcb20d3b95aec8a2984b2931ba66ed90c8138c08478a02721: Status 404 returned error can't find the container with id fdf84f45e418158bcb20d3b95aec8a2984b2931ba66ed90c8138c08478a02721 Jan 29 03:45:00 crc kubenswrapper[4707]: I0129 03:45:00.998257 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" podStartSLOduration=-9223371999.856539 podStartE2EDuration="36.998237617s" podCreationTimestamp="2026-01-29 03:44:24 +0000 UTC" firstStartedPulling="2026-01-29 03:44:24.932321086 +0000 UTC m=+1018.416549991" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:00.995234892 +0000 UTC m=+1054.479463817" watchObservedRunningTime="2026-01-29 03:45:00.998237617 +0000 UTC m=+1054.482466522" Jan 29 03:45:01 crc kubenswrapper[4707]: I0129 03:45:01.023694 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" podStartSLOduration=4.045776255 podStartE2EDuration="38.023665841s" podCreationTimestamp="2026-01-29 03:44:23 +0000 UTC" firstStartedPulling="2026-01-29 03:44:24.714394343 +0000 UTC m=+1018.198623248" lastFinishedPulling="2026-01-29 03:44:58.692283909 +0000 UTC m=+1052.176512834" observedRunningTime="2026-01-29 03:45:01.017406885 +0000 UTC m=+1054.501635810" watchObservedRunningTime="2026-01-29 03:45:01.023665841 +0000 UTC m=+1054.507894776" Jan 29 03:45:01 crc kubenswrapper[4707]: I0129 03:45:01.270307 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-nb-0" Jan 29 03:45:01 crc kubenswrapper[4707]: I0129 03:45:01.329590 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 29 03:45:01 crc kubenswrapper[4707]: I0129 03:45:01.713109 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 29 03:45:01 crc kubenswrapper[4707]: I0129 03:45:01.758383 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 29 03:45:01 crc kubenswrapper[4707]: I0129 03:45:01.980037 4707 generic.go:334] "Generic (PLEG): container finished" podID="afe0e902-d9e3-4b02-b1d9-6f187a075acf" containerID="c77de9cdb6984bec3614db4d649beebf0d2a69ccf634e03a30a37f19f060266b" exitCode=0 Jan 29 03:45:01 crc kubenswrapper[4707]: I0129 03:45:01.980095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w" event={"ID":"afe0e902-d9e3-4b02-b1d9-6f187a075acf","Type":"ContainerDied","Data":"c77de9cdb6984bec3614db4d649beebf0d2a69ccf634e03a30a37f19f060266b"} Jan 29 03:45:01 crc kubenswrapper[4707]: I0129 03:45:01.980925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w" event={"ID":"afe0e902-d9e3-4b02-b1d9-6f187a075acf","Type":"ContainerStarted","Data":"fdf84f45e418158bcb20d3b95aec8a2984b2931ba66ed90c8138c08478a02721"} Jan 29 03:45:01 crc kubenswrapper[4707]: I0129 03:45:01.982042 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.032324 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.038028 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 29 
03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.250455 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4wdj8"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.307139 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mbr2q"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.311712 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.315117 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.318644 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6rw89"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.319636 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.321116 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.336917 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mbr2q"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.351345 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6rw89"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.400427 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.400484 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc182e0-848e-43b3-8d1e-920440755bca-config\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.400515 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc182e0-848e-43b3-8d1e-920440755bca-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.400591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3bc182e0-848e-43b3-8d1e-920440755bca-ovn-rundir\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.400640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.400700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56lgp\" (UniqueName: \"kubernetes.io/projected/3bc182e0-848e-43b3-8d1e-920440755bca-kube-api-access-56lgp\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc 
kubenswrapper[4707]: I0129 03:45:02.400738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-672jm\" (UniqueName: \"kubernetes.io/projected/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-kube-api-access-672jm\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.400761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc182e0-848e-43b3-8d1e-920440755bca-combined-ca-bundle\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.400782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3bc182e0-848e-43b3-8d1e-920440755bca-ovs-rundir\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.400805 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-config\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.406471 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.408161 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.415562 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.415790 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.423370 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.424062 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lh9vg" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.429694 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-86m88"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.440939 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.494248 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vm5f5"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.495810 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.497784 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.501317 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vm5f5"] Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc182e0-848e-43b3-8d1e-920440755bca-config\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc182e0-848e-43b3-8d1e-920440755bca-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3bc182e0-848e-43b3-8d1e-920440755bca-ovn-rundir\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 
03:45:02.502588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b44fdd-6478-42a0-9817-b3d949683532-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56lgp\" (UniqueName: \"kubernetes.io/projected/3bc182e0-848e-43b3-8d1e-920440755bca-kube-api-access-56lgp\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502662 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r656c\" (UniqueName: \"kubernetes.io/projected/79b44fdd-6478-42a0-9817-b3d949683532-kube-api-access-r656c\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b44fdd-6478-42a0-9817-b3d949683532-config\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79b44fdd-6478-42a0-9817-b3d949683532-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-672jm\" (UniqueName: \"kubernetes.io/projected/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-kube-api-access-672jm\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc182e0-848e-43b3-8d1e-920440755bca-combined-ca-bundle\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502809 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3bc182e0-848e-43b3-8d1e-920440755bca-ovs-rundir\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b44fdd-6478-42a0-9817-b3d949683532-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-config\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b44fdd-6478-42a0-9817-b3d949683532-scripts\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.502934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b44fdd-6478-42a0-9817-b3d949683532-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.503270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc182e0-848e-43b3-8d1e-920440755bca-config\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.503444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3bc182e0-848e-43b3-8d1e-920440755bca-ovn-rundir\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.503634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3bc182e0-848e-43b3-8d1e-920440755bca-ovs-rundir\") pod \"ovn-controller-metrics-6rw89\" (UID: 
\"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.504487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.504640 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-config\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.504648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.512512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc182e0-848e-43b3-8d1e-920440755bca-combined-ca-bundle\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.513054 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bc182e0-848e-43b3-8d1e-920440755bca-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 
crc kubenswrapper[4707]: I0129 03:45:02.532698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-672jm\" (UniqueName: \"kubernetes.io/projected/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-kube-api-access-672jm\") pod \"dnsmasq-dns-7f896c8c65-mbr2q\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.541103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56lgp\" (UniqueName: \"kubernetes.io/projected/3bc182e0-848e-43b3-8d1e-920440755bca-kube-api-access-56lgp\") pod \"ovn-controller-metrics-6rw89\" (UID: \"3bc182e0-848e-43b3-8d1e-920440755bca\") " pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r656c\" (UniqueName: \"kubernetes.io/projected/79b44fdd-6478-42a0-9817-b3d949683532-kube-api-access-r656c\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605183 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b44fdd-6478-42a0-9817-b3d949683532-config\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79b44fdd-6478-42a0-9817-b3d949683532-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605240 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605277 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b44fdd-6478-42a0-9817-b3d949683532-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b44fdd-6478-42a0-9817-b3d949683532-scripts\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79b44fdd-6478-42a0-9817-b3d949683532-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q4ff\" (UniqueName: \"kubernetes.io/projected/1a97493c-cef6-4cf6-8c16-76864d7c24cc-kube-api-access-2q4ff\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-config\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.605453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b44fdd-6478-42a0-9817-b3d949683532-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.606831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79b44fdd-6478-42a0-9817-b3d949683532-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.607372 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b44fdd-6478-42a0-9817-b3d949683532-scripts\") pod \"ovn-northd-0\" (UID: 
\"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.607398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b44fdd-6478-42a0-9817-b3d949683532-config\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.610520 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b44fdd-6478-42a0-9817-b3d949683532-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.611078 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b44fdd-6478-42a0-9817-b3d949683532-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.611261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b44fdd-6478-42a0-9817-b3d949683532-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.626424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r656c\" (UniqueName: \"kubernetes.io/projected/79b44fdd-6478-42a0-9817-b3d949683532-kube-api-access-r656c\") pod \"ovn-northd-0\" (UID: \"79b44fdd-6478-42a0-9817-b3d949683532\") " pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.640643 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.661265 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6rw89" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.706640 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-config\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.706736 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.706759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.706775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.706833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q4ff\" (UniqueName: 
\"kubernetes.io/projected/1a97493c-cef6-4cf6-8c16-76864d7c24cc-kube-api-access-2q4ff\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.708439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.708452 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-config\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.708832 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.710094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.730076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q4ff\" (UniqueName: \"kubernetes.io/projected/1a97493c-cef6-4cf6-8c16-76864d7c24cc-kube-api-access-2q4ff\") pod 
\"dnsmasq-dns-86db49b7ff-vm5f5\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.737034 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.867978 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.989976 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" podUID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95" containerName="dnsmasq-dns" containerID="cri-o://60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f" gracePeriod=10 Jan 29 03:45:02 crc kubenswrapper[4707]: I0129 03:45:02.990194 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" podUID="804415ac-e69a-4ab4-b372-0743406324eb" containerName="dnsmasq-dns" containerID="cri-o://1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388" gracePeriod=10 Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.160664 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mbr2q"] Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.261971 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.267797 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6rw89"] Jan 29 03:45:03 crc kubenswrapper[4707]: W0129 03:45:03.277812 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc182e0_848e_43b3_8d1e_920440755bca.slice/crio-490245f42d819e51a401ab95c568a1011b23980bc988a3cc86684fb1fd0d2fe1 
WatchSource:0}: Error finding container 490245f42d819e51a401ab95c568a1011b23980bc988a3cc86684fb1fd0d2fe1: Status 404 returned error can't find the container with id 490245f42d819e51a401ab95c568a1011b23980bc988a3cc86684fb1fd0d2fe1 Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.300757 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.413337 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.422235 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vm5f5"] Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.544633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afe0e902-d9e3-4b02-b1d9-6f187a075acf-config-volume\") pod \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.544754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kxjq\" (UniqueName: \"kubernetes.io/projected/afe0e902-d9e3-4b02-b1d9-6f187a075acf-kube-api-access-2kxjq\") pod \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.544791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afe0e902-d9e3-4b02-b1d9-6f187a075acf-secret-volume\") pod \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\" (UID: \"afe0e902-d9e3-4b02-b1d9-6f187a075acf\") " Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.545802 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/afe0e902-d9e3-4b02-b1d9-6f187a075acf-config-volume" (OuterVolumeSpecName: "config-volume") pod "afe0e902-d9e3-4b02-b1d9-6f187a075acf" (UID: "afe0e902-d9e3-4b02-b1d9-6f187a075acf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.552149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe0e902-d9e3-4b02-b1d9-6f187a075acf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "afe0e902-d9e3-4b02-b1d9-6f187a075acf" (UID: "afe0e902-d9e3-4b02-b1d9-6f187a075acf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.558470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe0e902-d9e3-4b02-b1d9-6f187a075acf-kube-api-access-2kxjq" (OuterVolumeSpecName: "kube-api-access-2kxjq") pod "afe0e902-d9e3-4b02-b1d9-6f187a075acf" (UID: "afe0e902-d9e3-4b02-b1d9-6f187a075acf"). InnerVolumeSpecName "kube-api-access-2kxjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.647080 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afe0e902-d9e3-4b02-b1d9-6f187a075acf-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.647124 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kxjq\" (UniqueName: \"kubernetes.io/projected/afe0e902-d9e3-4b02-b1d9-6f187a075acf-kube-api-access-2kxjq\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.647140 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afe0e902-d9e3-4b02-b1d9-6f187a075acf-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.746048 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.751111 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.850831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-config\") pod \"804415ac-e69a-4ab4-b372-0743406324eb\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.850908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzhjb\" (UniqueName: \"kubernetes.io/projected/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-kube-api-access-hzhjb\") pod \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.850944 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-dns-svc\") pod \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.850979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-config\") pod \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\" (UID: \"c5b75e3f-4913-45d7-81fa-91bb3c0fca95\") " Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.851189 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-dns-svc\") pod \"804415ac-e69a-4ab4-b372-0743406324eb\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.851249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52hhw\" (UniqueName: 
\"kubernetes.io/projected/804415ac-e69a-4ab4-b372-0743406324eb-kube-api-access-52hhw\") pod \"804415ac-e69a-4ab4-b372-0743406324eb\" (UID: \"804415ac-e69a-4ab4-b372-0743406324eb\") " Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.858811 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804415ac-e69a-4ab4-b372-0743406324eb-kube-api-access-52hhw" (OuterVolumeSpecName: "kube-api-access-52hhw") pod "804415ac-e69a-4ab4-b372-0743406324eb" (UID: "804415ac-e69a-4ab4-b372-0743406324eb"). InnerVolumeSpecName "kube-api-access-52hhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.859777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-kube-api-access-hzhjb" (OuterVolumeSpecName: "kube-api-access-hzhjb") pod "c5b75e3f-4913-45d7-81fa-91bb3c0fca95" (UID: "c5b75e3f-4913-45d7-81fa-91bb3c0fca95"). InnerVolumeSpecName "kube-api-access-hzhjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.893224 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5b75e3f-4913-45d7-81fa-91bb3c0fca95" (UID: "c5b75e3f-4913-45d7-81fa-91bb3c0fca95"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.898949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-config" (OuterVolumeSpecName: "config") pod "804415ac-e69a-4ab4-b372-0743406324eb" (UID: "804415ac-e69a-4ab4-b372-0743406324eb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.899957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-config" (OuterVolumeSpecName: "config") pod "c5b75e3f-4913-45d7-81fa-91bb3c0fca95" (UID: "c5b75e3f-4913-45d7-81fa-91bb3c0fca95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.918857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "804415ac-e69a-4ab4-b372-0743406324eb" (UID: "804415ac-e69a-4ab4-b372-0743406324eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.953320 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.953368 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzhjb\" (UniqueName: \"kubernetes.io/projected/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-kube-api-access-hzhjb\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.953384 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.953398 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5b75e3f-4913-45d7-81fa-91bb3c0fca95-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.953410 
4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/804415ac-e69a-4ab4-b372-0743406324eb-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:03 crc kubenswrapper[4707]: I0129 03:45:03.953422 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52hhw\" (UniqueName: \"kubernetes.io/projected/804415ac-e69a-4ab4-b372-0743406324eb-kube-api-access-52hhw\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.005753 4707 generic.go:334] "Generic (PLEG): container finished" podID="5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" containerID="3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7" exitCode=0 Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.005804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" event={"ID":"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15","Type":"ContainerDied","Data":"3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.005873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" event={"ID":"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15","Type":"ContainerStarted","Data":"d6e5ffe59ac9ed6c22da79334796a687737164442ed958d233a583a05c55ac87"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.009510 4707 generic.go:334] "Generic (PLEG): container finished" podID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95" containerID="60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f" exitCode=0 Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.009641 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" event={"ID":"c5b75e3f-4913-45d7-81fa-91bb3c0fca95","Type":"ContainerDied","Data":"60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.009667 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" event={"ID":"c5b75e3f-4913-45d7-81fa-91bb3c0fca95","Type":"ContainerDied","Data":"1047c4798563adf2cd5fc246d16f0660cfec653f383974cab942ef3bab2310ab"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.009620 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-86m88" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.009729 4707 scope.go:117] "RemoveContainer" containerID="60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.012503 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a97493c-cef6-4cf6-8c16-76864d7c24cc" containerID="16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468" exitCode=0 Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.012632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" event={"ID":"1a97493c-cef6-4cf6-8c16-76864d7c24cc","Type":"ContainerDied","Data":"16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.012678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" event={"ID":"1a97493c-cef6-4cf6-8c16-76864d7c24cc","Type":"ContainerStarted","Data":"115b4f6808c9e902ceb345171f5ec9560914cd26c8554282d95cbc4f12003340"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.014373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"79b44fdd-6478-42a0-9817-b3d949683532","Type":"ContainerStarted","Data":"db462df3ab2450aefe28aba1255f87812b5d04ce2a9f055ee38af12433f7e132"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.019704 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w" 
event={"ID":"afe0e902-d9e3-4b02-b1d9-6f187a075acf","Type":"ContainerDied","Data":"fdf84f45e418158bcb20d3b95aec8a2984b2931ba66ed90c8138c08478a02721"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.019745 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdf84f45e418158bcb20d3b95aec8a2984b2931ba66ed90c8138c08478a02721" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.019813 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.023079 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6rw89" event={"ID":"3bc182e0-848e-43b3-8d1e-920440755bca","Type":"ContainerStarted","Data":"7a8d593f3412c0c22266a3108f4ec6c291cc73c1f429a0722e738b385bf93a86"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.023515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6rw89" event={"ID":"3bc182e0-848e-43b3-8d1e-920440755bca","Type":"ContainerStarted","Data":"490245f42d819e51a401ab95c568a1011b23980bc988a3cc86684fb1fd0d2fe1"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.025709 4707 generic.go:334] "Generic (PLEG): container finished" podID="804415ac-e69a-4ab4-b372-0743406324eb" containerID="1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388" exitCode=0 Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.028343 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.029425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" event={"ID":"804415ac-e69a-4ab4-b372-0743406324eb","Type":"ContainerDied","Data":"1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.029531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-4wdj8" event={"ID":"804415ac-e69a-4ab4-b372-0743406324eb","Type":"ContainerDied","Data":"fa7d10989527c37c11810b676208dca2f84ff68d0cad133554a4c6a6b0bac0db"} Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.043917 4707 scope.go:117] "RemoveContainer" containerID="6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.066146 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6rw89" podStartSLOduration=2.066121801 podStartE2EDuration="2.066121801s" podCreationTimestamp="2026-01-29 03:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:04.054177316 +0000 UTC m=+1057.538406231" watchObservedRunningTime="2026-01-29 03:45:04.066121801 +0000 UTC m=+1057.550350706" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.136559 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4wdj8"] Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.150008 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-4wdj8"] Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.155255 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-86m88"] Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.160646 
4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-86m88"] Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.365972 4707 scope.go:117] "RemoveContainer" containerID="60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f" Jan 29 03:45:04 crc kubenswrapper[4707]: E0129 03:45:04.368851 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f\": container with ID starting with 60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f not found: ID does not exist" containerID="60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.368944 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f"} err="failed to get container status \"60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f\": rpc error: code = NotFound desc = could not find container \"60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f\": container with ID starting with 60a897e3e5c43fd6897f3316c7171c1b253d6f505ac78c18cd6ffb6c94e8130f not found: ID does not exist" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.368987 4707 scope.go:117] "RemoveContainer" containerID="6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af" Jan 29 03:45:04 crc kubenswrapper[4707]: E0129 03:45:04.369787 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af\": container with ID starting with 6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af not found: ID does not exist" containerID="6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af" Jan 29 03:45:04 
crc kubenswrapper[4707]: I0129 03:45:04.369843 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af"} err="failed to get container status \"6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af\": rpc error: code = NotFound desc = could not find container \"6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af\": container with ID starting with 6db6a8ba31b39f38a0354199c48b81edd86f60adb030abbb14e20cdc3faa04af not found: ID does not exist" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.369884 4707 scope.go:117] "RemoveContainer" containerID="1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.429378 4707 scope.go:117] "RemoveContainer" containerID="0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.489092 4707 scope.go:117] "RemoveContainer" containerID="1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388" Jan 29 03:45:04 crc kubenswrapper[4707]: E0129 03:45:04.489616 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388\": container with ID starting with 1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388 not found: ID does not exist" containerID="1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.489661 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388"} err="failed to get container status \"1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388\": rpc error: code = NotFound desc = could not find container 
\"1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388\": container with ID starting with 1c970cb58e06d7dc54f4900595302c73cf14485a1e972287882b3c3543897388 not found: ID does not exist" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.489688 4707 scope.go:117] "RemoveContainer" containerID="0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb" Jan 29 03:45:04 crc kubenswrapper[4707]: E0129 03:45:04.490153 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb\": container with ID starting with 0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb not found: ID does not exist" containerID="0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb" Jan 29 03:45:04 crc kubenswrapper[4707]: I0129 03:45:04.490228 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb"} err="failed to get container status \"0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb\": rpc error: code = NotFound desc = could not find container \"0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb\": container with ID starting with 0f4ad2327076c0904cb98daa35169bd4e7b21fe10dafeecf001c5cc7889cb5cb not found: ID does not exist" Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.061192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" event={"ID":"1a97493c-cef6-4cf6-8c16-76864d7c24cc","Type":"ContainerStarted","Data":"5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd"} Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.061703 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.064081 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"79b44fdd-6478-42a0-9817-b3d949683532","Type":"ContainerStarted","Data":"439b7e8349f333dd5245c15954dd7b7f4341baf06a084f4997c8e2473a9f4b40"} Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.064136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"79b44fdd-6478-42a0-9817-b3d949683532","Type":"ContainerStarted","Data":"300bb574b9917f3a9355e961e7ca1441809a9e955154cabbd6cbaddc867f844d"} Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.064573 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.069465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" event={"ID":"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15","Type":"ContainerStarted","Data":"e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d"} Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.069749 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.086449 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" podStartSLOduration=3.086408608 podStartE2EDuration="3.086408608s" podCreationTimestamp="2026-01-29 03:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:05.083887647 +0000 UTC m=+1058.568116552" watchObservedRunningTime="2026-01-29 03:45:05.086408608 +0000 UTC m=+1058.570637503" Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.112726 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.8839077309999999 podStartE2EDuration="3.112700386s" 
podCreationTimestamp="2026-01-29 03:45:02 +0000 UTC" firstStartedPulling="2026-01-29 03:45:03.264692544 +0000 UTC m=+1056.748921449" lastFinishedPulling="2026-01-29 03:45:04.493485199 +0000 UTC m=+1057.977714104" observedRunningTime="2026-01-29 03:45:05.111310507 +0000 UTC m=+1058.595539442" watchObservedRunningTime="2026-01-29 03:45:05.112700386 +0000 UTC m=+1058.596929281" Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.141037 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" podStartSLOduration=3.141001851 podStartE2EDuration="3.141001851s" podCreationTimestamp="2026-01-29 03:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:05.134740696 +0000 UTC m=+1058.618969691" watchObservedRunningTime="2026-01-29 03:45:05.141001851 +0000 UTC m=+1058.625230786" Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.255237 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804415ac-e69a-4ab4-b372-0743406324eb" path="/var/lib/kubelet/pods/804415ac-e69a-4ab4-b372-0743406324eb/volumes" Jan 29 03:45:05 crc kubenswrapper[4707]: I0129 03:45:05.256141 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95" path="/var/lib/kubelet/pods/c5b75e3f-4913-45d7-81fa-91bb3c0fca95/volumes" Jan 29 03:45:06 crc kubenswrapper[4707]: I0129 03:45:06.897056 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 29 03:45:06 crc kubenswrapper[4707]: I0129 03:45:06.898603 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 29 03:45:06 crc kubenswrapper[4707]: I0129 03:45:06.988695 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 29 03:45:07 crc 
kubenswrapper[4707]: I0129 03:45:07.168388 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.059716 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-66c3-account-create-update-gzzxt"] Jan 29 03:45:08 crc kubenswrapper[4707]: E0129 03:45:08.060218 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95" containerName="init" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.060236 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95" containerName="init" Jan 29 03:45:08 crc kubenswrapper[4707]: E0129 03:45:08.060277 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804415ac-e69a-4ab4-b372-0743406324eb" containerName="init" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.060284 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="804415ac-e69a-4ab4-b372-0743406324eb" containerName="init" Jan 29 03:45:08 crc kubenswrapper[4707]: E0129 03:45:08.060304 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95" containerName="dnsmasq-dns" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.060312 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95" containerName="dnsmasq-dns" Jan 29 03:45:08 crc kubenswrapper[4707]: E0129 03:45:08.060344 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe0e902-d9e3-4b02-b1d9-6f187a075acf" containerName="collect-profiles" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.060352 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe0e902-d9e3-4b02-b1d9-6f187a075acf" containerName="collect-profiles" Jan 29 03:45:08 crc kubenswrapper[4707]: E0129 03:45:08.060378 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="804415ac-e69a-4ab4-b372-0743406324eb" containerName="dnsmasq-dns" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.060385 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="804415ac-e69a-4ab4-b372-0743406324eb" containerName="dnsmasq-dns" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.060607 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b75e3f-4913-45d7-81fa-91bb3c0fca95" containerName="dnsmasq-dns" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.060629 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe0e902-d9e3-4b02-b1d9-6f187a075acf" containerName="collect-profiles" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.060641 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="804415ac-e69a-4ab4-b372-0743406324eb" containerName="dnsmasq-dns" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.061379 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.066971 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.072614 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hxjqt"] Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.074260 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.078759 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-66c3-account-create-update-gzzxt"] Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.106486 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hxjqt"] Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.151025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0826415-3b19-4838-b645-1d5e36ba6e16-operator-scripts\") pod \"keystone-db-create-hxjqt\" (UID: \"c0826415-3b19-4838-b645-1d5e36ba6e16\") " pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.151116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tbkt\" (UniqueName: \"kubernetes.io/projected/ba70b464-3262-4a27-b710-6ec145fc1a8f-kube-api-access-5tbkt\") pod \"keystone-66c3-account-create-update-gzzxt\" (UID: \"ba70b464-3262-4a27-b710-6ec145fc1a8f\") " pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.151385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba70b464-3262-4a27-b710-6ec145fc1a8f-operator-scripts\") pod \"keystone-66c3-account-create-update-gzzxt\" (UID: \"ba70b464-3262-4a27-b710-6ec145fc1a8f\") " pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.151444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdsx5\" (UniqueName: \"kubernetes.io/projected/c0826415-3b19-4838-b645-1d5e36ba6e16-kube-api-access-qdsx5\") pod 
\"keystone-db-create-hxjqt\" (UID: \"c0826415-3b19-4838-b645-1d5e36ba6e16\") " pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.158477 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.158558 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.232763 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-brvzb"] Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.236508 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-brvzb" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.244440 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-brvzb"] Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.253018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0826415-3b19-4838-b645-1d5e36ba6e16-operator-scripts\") pod \"keystone-db-create-hxjqt\" (UID: \"c0826415-3b19-4838-b645-1d5e36ba6e16\") " pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.253085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tbkt\" (UniqueName: \"kubernetes.io/projected/ba70b464-3262-4a27-b710-6ec145fc1a8f-kube-api-access-5tbkt\") pod \"keystone-66c3-account-create-update-gzzxt\" (UID: \"ba70b464-3262-4a27-b710-6ec145fc1a8f\") " pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.253196 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ba70b464-3262-4a27-b710-6ec145fc1a8f-operator-scripts\") pod \"keystone-66c3-account-create-update-gzzxt\" (UID: \"ba70b464-3262-4a27-b710-6ec145fc1a8f\") " pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.253234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdsx5\" (UniqueName: \"kubernetes.io/projected/c0826415-3b19-4838-b645-1d5e36ba6e16-kube-api-access-qdsx5\") pod \"keystone-db-create-hxjqt\" (UID: \"c0826415-3b19-4838-b645-1d5e36ba6e16\") " pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.254012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0826415-3b19-4838-b645-1d5e36ba6e16-operator-scripts\") pod \"keystone-db-create-hxjqt\" (UID: \"c0826415-3b19-4838-b645-1d5e36ba6e16\") " pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.254736 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba70b464-3262-4a27-b710-6ec145fc1a8f-operator-scripts\") pod \"keystone-66c3-account-create-update-gzzxt\" (UID: \"ba70b464-3262-4a27-b710-6ec145fc1a8f\") " pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.263984 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.279043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tbkt\" (UniqueName: \"kubernetes.io/projected/ba70b464-3262-4a27-b710-6ec145fc1a8f-kube-api-access-5tbkt\") pod \"keystone-66c3-account-create-update-gzzxt\" (UID: \"ba70b464-3262-4a27-b710-6ec145fc1a8f\") " 
pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.280869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdsx5\" (UniqueName: \"kubernetes.io/projected/c0826415-3b19-4838-b645-1d5e36ba6e16-kube-api-access-qdsx5\") pod \"keystone-db-create-hxjqt\" (UID: \"c0826415-3b19-4838-b645-1d5e36ba6e16\") " pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.355376 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cea5dce-4548-46ba-b461-8c98a63a7daf-operator-scripts\") pod \"placement-db-create-brvzb\" (UID: \"3cea5dce-4548-46ba-b461-8c98a63a7daf\") " pod="openstack/placement-db-create-brvzb" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.355815 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxnh8\" (UniqueName: \"kubernetes.io/projected/3cea5dce-4548-46ba-b461-8c98a63a7daf-kube-api-access-gxnh8\") pod \"placement-db-create-brvzb\" (UID: \"3cea5dce-4548-46ba-b461-8c98a63a7daf\") " pod="openstack/placement-db-create-brvzb" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.415240 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.429855 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.449697 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0e72-account-create-update-shtrl"] Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.451655 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.454363 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.458092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxnh8\" (UniqueName: \"kubernetes.io/projected/3cea5dce-4548-46ba-b461-8c98a63a7daf-kube-api-access-gxnh8\") pod \"placement-db-create-brvzb\" (UID: \"3cea5dce-4548-46ba-b461-8c98a63a7daf\") " pod="openstack/placement-db-create-brvzb" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.458605 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cea5dce-4548-46ba-b461-8c98a63a7daf-operator-scripts\") pod \"placement-db-create-brvzb\" (UID: \"3cea5dce-4548-46ba-b461-8c98a63a7daf\") " pod="openstack/placement-db-create-brvzb" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.459481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cea5dce-4548-46ba-b461-8c98a63a7daf-operator-scripts\") pod \"placement-db-create-brvzb\" (UID: \"3cea5dce-4548-46ba-b461-8c98a63a7daf\") " pod="openstack/placement-db-create-brvzb" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.481185 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0e72-account-create-update-shtrl"] Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.486746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxnh8\" (UniqueName: \"kubernetes.io/projected/3cea5dce-4548-46ba-b461-8c98a63a7daf-kube-api-access-gxnh8\") pod \"placement-db-create-brvzb\" (UID: \"3cea5dce-4548-46ba-b461-8c98a63a7daf\") " pod="openstack/placement-db-create-brvzb" Jan 29 03:45:08 crc 
kubenswrapper[4707]: I0129 03:45:08.553038 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-brvzb" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.565438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdzc5\" (UniqueName: \"kubernetes.io/projected/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-kube-api-access-xdzc5\") pod \"placement-0e72-account-create-update-shtrl\" (UID: \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\") " pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.565492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-operator-scripts\") pod \"placement-0e72-account-create-update-shtrl\" (UID: \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\") " pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.668049 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdzc5\" (UniqueName: \"kubernetes.io/projected/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-kube-api-access-xdzc5\") pod \"placement-0e72-account-create-update-shtrl\" (UID: \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\") " pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.668490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-operator-scripts\") pod \"placement-0e72-account-create-update-shtrl\" (UID: \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\") " pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.669709 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-operator-scripts\") pod \"placement-0e72-account-create-update-shtrl\" (UID: \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\") " pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.694375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdzc5\" (UniqueName: \"kubernetes.io/projected/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-kube-api-access-xdzc5\") pod \"placement-0e72-account-create-update-shtrl\" (UID: \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\") " pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.858326 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.886353 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-brvzb"] Jan 29 03:45:08 crc kubenswrapper[4707]: W0129 03:45:08.899370 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cea5dce_4548_46ba_b461_8c98a63a7daf.slice/crio-9ed9021ceee769d276715567d24d983c8dc799b20effc73a0caadf90824ccde5 WatchSource:0}: Error finding container 9ed9021ceee769d276715567d24d983c8dc799b20effc73a0caadf90824ccde5: Status 404 returned error can't find the container with id 9ed9021ceee769d276715567d24d983c8dc799b20effc73a0caadf90824ccde5 Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.966599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-66c3-account-create-update-gzzxt"] Jan 29 03:45:08 crc kubenswrapper[4707]: I0129 03:45:08.983499 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hxjqt"] Jan 29 03:45:09 crc 
kubenswrapper[4707]: I0129 03:45:09.117400 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-brvzb" event={"ID":"3cea5dce-4548-46ba-b461-8c98a63a7daf","Type":"ContainerStarted","Data":"4eb5b45b2d80302bb040d1a4c3b6896389f5ad7a1867de1d8a3b1723b93a8f69"} Jan 29 03:45:09 crc kubenswrapper[4707]: I0129 03:45:09.117912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-brvzb" event={"ID":"3cea5dce-4548-46ba-b461-8c98a63a7daf","Type":"ContainerStarted","Data":"9ed9021ceee769d276715567d24d983c8dc799b20effc73a0caadf90824ccde5"} Jan 29 03:45:09 crc kubenswrapper[4707]: I0129 03:45:09.119773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hxjqt" event={"ID":"c0826415-3b19-4838-b645-1d5e36ba6e16","Type":"ContainerStarted","Data":"f72676f101cadc4fd2c00415e3dcd8ca889486a1e9caf1982d7d22cc2b130428"} Jan 29 03:45:09 crc kubenswrapper[4707]: I0129 03:45:09.125034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66c3-account-create-update-gzzxt" event={"ID":"ba70b464-3262-4a27-b710-6ec145fc1a8f","Type":"ContainerStarted","Data":"64bee9f9512f153c32e636b019db972252fec50143dd474af30933f3bf4d05db"} Jan 29 03:45:09 crc kubenswrapper[4707]: I0129 03:45:09.261174 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 29 03:45:09 crc kubenswrapper[4707]: I0129 03:45:09.294232 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-brvzb" podStartSLOduration=1.294205819 podStartE2EDuration="1.294205819s" podCreationTimestamp="2026-01-29 03:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:09.145335837 +0000 UTC m=+1062.629564742" watchObservedRunningTime="2026-01-29 03:45:09.294205819 +0000 UTC m=+1062.778434724" Jan 29 
03:45:09 crc kubenswrapper[4707]: I0129 03:45:09.365035 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0e72-account-create-update-shtrl"] Jan 29 03:45:09 crc kubenswrapper[4707]: W0129 03:45:09.391213 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb2a1269_a03b_4a5f_b61e_ad1f94576f9e.slice/crio-b76411d1acde9ebfbd5409b4aa10bfb6a067aa472d9ed7b804d4419bf7651b97 WatchSource:0}: Error finding container b76411d1acde9ebfbd5409b4aa10bfb6a067aa472d9ed7b804d4419bf7651b97: Status 404 returned error can't find the container with id b76411d1acde9ebfbd5409b4aa10bfb6a067aa472d9ed7b804d4419bf7651b97 Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.153790 4707 generic.go:334] "Generic (PLEG): container finished" podID="c0826415-3b19-4838-b645-1d5e36ba6e16" containerID="6aa2773ce2855195c8d89e118373e54b6473d11c286b519b57fa479193263ce4" exitCode=0 Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.154032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hxjqt" event={"ID":"c0826415-3b19-4838-b645-1d5e36ba6e16","Type":"ContainerDied","Data":"6aa2773ce2855195c8d89e118373e54b6473d11c286b519b57fa479193263ce4"} Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.171908 4707 generic.go:334] "Generic (PLEG): container finished" podID="ba70b464-3262-4a27-b710-6ec145fc1a8f" containerID="ea1fb86af1f4e7c44a74b553e65ed8578eae636a139f6e103fe26a55aa675452" exitCode=0 Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.175880 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mbr2q"] Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.175983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66c3-account-create-update-gzzxt" 
event={"ID":"ba70b464-3262-4a27-b710-6ec145fc1a8f","Type":"ContainerDied","Data":"ea1fb86af1f4e7c44a74b553e65ed8578eae636a139f6e103fe26a55aa675452"} Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.176228 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" podUID="5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" containerName="dnsmasq-dns" containerID="cri-o://e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d" gracePeriod=10 Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.177740 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.191244 4707 generic.go:334] "Generic (PLEG): container finished" podID="cb2a1269-a03b-4a5f-b61e-ad1f94576f9e" containerID="f5e79da3b8d462ac93c347688474fe44bd456d32086b8d4a5830a90ab4325747" exitCode=0 Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.191356 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0e72-account-create-update-shtrl" event={"ID":"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e","Type":"ContainerDied","Data":"f5e79da3b8d462ac93c347688474fe44bd456d32086b8d4a5830a90ab4325747"} Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.191389 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0e72-account-create-update-shtrl" event={"ID":"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e","Type":"ContainerStarted","Data":"b76411d1acde9ebfbd5409b4aa10bfb6a067aa472d9ed7b804d4419bf7651b97"} Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.193307 4707 generic.go:334] "Generic (PLEG): container finished" podID="3cea5dce-4548-46ba-b461-8c98a63a7daf" containerID="4eb5b45b2d80302bb040d1a4c3b6896389f5ad7a1867de1d8a3b1723b93a8f69" exitCode=0 Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.193571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-brvzb" event={"ID":"3cea5dce-4548-46ba-b461-8c98a63a7daf","Type":"ContainerDied","Data":"4eb5b45b2d80302bb040d1a4c3b6896389f5ad7a1867de1d8a3b1723b93a8f69"} Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.222665 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-pxd9d"] Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.224578 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.253778 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pxd9d"] Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.308989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-config\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.309106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.309165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.309238 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-dns-svc\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.309273 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bhlv\" (UniqueName: \"kubernetes.io/projected/12874d9e-78e2-4430-a7e0-7f542dc518c0-kube-api-access-4bhlv\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.420320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.420382 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.420546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-dns-svc\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.420575 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4bhlv\" (UniqueName: \"kubernetes.io/projected/12874d9e-78e2-4430-a7e0-7f542dc518c0-kube-api-access-4bhlv\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.420643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-config\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.421835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.421845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-config\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.423289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-dns-svc\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.423349 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.454122 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bhlv\" (UniqueName: \"kubernetes.io/projected/12874d9e-78e2-4430-a7e0-7f542dc518c0-kube-api-access-4bhlv\") pod \"dnsmasq-dns-698758b865-pxd9d\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.659786 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.756225 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.830064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-672jm\" (UniqueName: \"kubernetes.io/projected/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-kube-api-access-672jm\") pod \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.830161 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-config\") pod \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.830288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-dns-svc\") pod \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\" (UID: 
\"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.830341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-ovsdbserver-sb\") pod \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\" (UID: \"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15\") " Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.834989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-kube-api-access-672jm" (OuterVolumeSpecName: "kube-api-access-672jm") pod "5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" (UID: "5cf134f5-aa0d-4ddf-ad13-aaec527fcc15"). InnerVolumeSpecName "kube-api-access-672jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.878460 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-config" (OuterVolumeSpecName: "config") pod "5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" (UID: "5cf134f5-aa0d-4ddf-ad13-aaec527fcc15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.879139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" (UID: "5cf134f5-aa0d-4ddf-ad13-aaec527fcc15"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.881094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" (UID: "5cf134f5-aa0d-4ddf-ad13-aaec527fcc15"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.932944 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-672jm\" (UniqueName: \"kubernetes.io/projected/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-kube-api-access-672jm\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.933351 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.933363 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:10 crc kubenswrapper[4707]: I0129 03:45:10.933374 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.167854 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pxd9d"] Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.203984 4707 generic.go:334] "Generic (PLEG): container finished" podID="5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" containerID="e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d" exitCode=0 Jan 29 03:45:11 crc 
kubenswrapper[4707]: I0129 03:45:11.204030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" event={"ID":"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15","Type":"ContainerDied","Data":"e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d"} Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.204090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" event={"ID":"5cf134f5-aa0d-4ddf-ad13-aaec527fcc15","Type":"ContainerDied","Data":"d6e5ffe59ac9ed6c22da79334796a687737164442ed958d233a583a05c55ac87"} Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.204117 4707 scope.go:117] "RemoveContainer" containerID="e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.204119 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-mbr2q" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.205943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pxd9d" event={"ID":"12874d9e-78e2-4430-a7e0-7f542dc518c0","Type":"ContainerStarted","Data":"464a80d47fe8cf6d4c11a0eaf013c85f8604c3f0bad8db13705ece09a6669fc0"} Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.255043 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mbr2q"] Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.262430 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-mbr2q"] Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.272879 4707 scope.go:117] "RemoveContainer" containerID="3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.309894 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 29 03:45:11 crc kubenswrapper[4707]: E0129 
03:45:11.310269 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" containerName="init" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.310282 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" containerName="init" Jan 29 03:45:11 crc kubenswrapper[4707]: E0129 03:45:11.310293 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" containerName="dnsmasq-dns" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.310301 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" containerName="dnsmasq-dns" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.310474 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" containerName="dnsmasq-dns" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.316119 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.321928 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.322160 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.322279 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.322389 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2rfks" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.339062 4707 scope.go:117] "RemoveContainer" containerID="e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d" Jan 29 03:45:11 crc kubenswrapper[4707]: E0129 03:45:11.341910 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d\": container with ID starting with e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d not found: ID does not exist" containerID="e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.341971 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d"} err="failed to get container status \"e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d\": rpc error: code = NotFound desc = could not find container \"e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d\": container with ID starting with e648dd2569f8d596b802da5752e96da16fe5c4074aab77c7db00759d2fc4b58d not found: ID does not exist" Jan 29 03:45:11 crc kubenswrapper[4707]: 
I0129 03:45:11.342012 4707 scope.go:117] "RemoveContainer" containerID="3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7" Jan 29 03:45:11 crc kubenswrapper[4707]: E0129 03:45:11.342489 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7\": container with ID starting with 3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7 not found: ID does not exist" containerID="3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.342552 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7"} err="failed to get container status \"3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7\": rpc error: code = NotFound desc = could not find container \"3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7\": container with ID starting with 3acaac8edc10ddcbcecdd8572ba3e29d24d593789a785a05ce9ea1390e81fec7 not found: ID does not exist" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.351331 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.447439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/249edadf-1bb4-4d39-aae3-40384ba10bae-cache\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.447927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpl8f\" (UniqueName: 
\"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-kube-api-access-qpl8f\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.448001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249edadf-1bb4-4d39-aae3-40384ba10bae-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.448042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.448070 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.448120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/249edadf-1bb4-4d39-aae3-40384ba10bae-lock\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.550054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpl8f\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-kube-api-access-qpl8f\") pod \"swift-storage-0\" (UID: 
\"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.550191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249edadf-1bb4-4d39-aae3-40384ba10bae-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.550248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.550292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.550367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/249edadf-1bb4-4d39-aae3-40384ba10bae-lock\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.550393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/249edadf-1bb4-4d39-aae3-40384ba10bae-cache\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: E0129 03:45:11.550501 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found 
Jan 29 03:45:11 crc kubenswrapper[4707]: E0129 03:45:11.550529 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 03:45:11 crc kubenswrapper[4707]: E0129 03:45:11.550604 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift podName:249edadf-1bb4-4d39-aae3-40384ba10bae nodeName:}" failed. No retries permitted until 2026-01-29 03:45:12.050577234 +0000 UTC m=+1065.534806139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift") pod "swift-storage-0" (UID: "249edadf-1bb4-4d39-aae3-40384ba10bae") : configmap "swift-ring-files" not found Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.551148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/249edadf-1bb4-4d39-aae3-40384ba10bae-cache\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.551692 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.551917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/249edadf-1bb4-4d39-aae3-40384ba10bae-lock\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.560950 4707 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.561695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249edadf-1bb4-4d39-aae3-40384ba10bae-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.572822 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpl8f\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-kube-api-access-qpl8f\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.585121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.652206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba70b464-3262-4a27-b710-6ec145fc1a8f-operator-scripts\") pod \"ba70b464-3262-4a27-b710-6ec145fc1a8f\" (UID: \"ba70b464-3262-4a27-b710-6ec145fc1a8f\") " Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.652435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tbkt\" (UniqueName: \"kubernetes.io/projected/ba70b464-3262-4a27-b710-6ec145fc1a8f-kube-api-access-5tbkt\") pod \"ba70b464-3262-4a27-b710-6ec145fc1a8f\" (UID: \"ba70b464-3262-4a27-b710-6ec145fc1a8f\") " Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.653336 4707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/ba70b464-3262-4a27-b710-6ec145fc1a8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba70b464-3262-4a27-b710-6ec145fc1a8f" (UID: "ba70b464-3262-4a27-b710-6ec145fc1a8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.657129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba70b464-3262-4a27-b710-6ec145fc1a8f-kube-api-access-5tbkt" (OuterVolumeSpecName: "kube-api-access-5tbkt") pod "ba70b464-3262-4a27-b710-6ec145fc1a8f" (UID: "ba70b464-3262-4a27-b710-6ec145fc1a8f"). InnerVolumeSpecName "kube-api-access-5tbkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.677734 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.684144 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.754164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-operator-scripts\") pod \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\" (UID: \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\") " Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.754222 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdsx5\" (UniqueName: \"kubernetes.io/projected/c0826415-3b19-4838-b645-1d5e36ba6e16-kube-api-access-qdsx5\") pod \"c0826415-3b19-4838-b645-1d5e36ba6e16\" (UID: \"c0826415-3b19-4838-b645-1d5e36ba6e16\") " Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.754382 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdzc5\" (UniqueName: \"kubernetes.io/projected/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-kube-api-access-xdzc5\") pod \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\" (UID: \"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e\") " Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.754466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0826415-3b19-4838-b645-1d5e36ba6e16-operator-scripts\") pod \"c0826415-3b19-4838-b645-1d5e36ba6e16\" (UID: \"c0826415-3b19-4838-b645-1d5e36ba6e16\") " Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.754723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb2a1269-a03b-4a5f-b61e-ad1f94576f9e" (UID: "cb2a1269-a03b-4a5f-b61e-ad1f94576f9e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.755007 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.755172 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba70b464-3262-4a27-b710-6ec145fc1a8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.755189 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tbkt\" (UniqueName: \"kubernetes.io/projected/ba70b464-3262-4a27-b710-6ec145fc1a8f-kube-api-access-5tbkt\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.755074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0826415-3b19-4838-b645-1d5e36ba6e16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0826415-3b19-4838-b645-1d5e36ba6e16" (UID: "c0826415-3b19-4838-b645-1d5e36ba6e16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.757379 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-kube-api-access-xdzc5" (OuterVolumeSpecName: "kube-api-access-xdzc5") pod "cb2a1269-a03b-4a5f-b61e-ad1f94576f9e" (UID: "cb2a1269-a03b-4a5f-b61e-ad1f94576f9e"). InnerVolumeSpecName "kube-api-access-xdzc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.760773 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0826415-3b19-4838-b645-1d5e36ba6e16-kube-api-access-qdsx5" (OuterVolumeSpecName: "kube-api-access-qdsx5") pod "c0826415-3b19-4838-b645-1d5e36ba6e16" (UID: "c0826415-3b19-4838-b645-1d5e36ba6e16"). InnerVolumeSpecName "kube-api-access-qdsx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.762383 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-brvzb" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.856748 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cea5dce-4548-46ba-b461-8c98a63a7daf-operator-scripts\") pod \"3cea5dce-4548-46ba-b461-8c98a63a7daf\" (UID: \"3cea5dce-4548-46ba-b461-8c98a63a7daf\") " Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.856916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxnh8\" (UniqueName: \"kubernetes.io/projected/3cea5dce-4548-46ba-b461-8c98a63a7daf-kube-api-access-gxnh8\") pod \"3cea5dce-4548-46ba-b461-8c98a63a7daf\" (UID: \"3cea5dce-4548-46ba-b461-8c98a63a7daf\") " Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.857262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cea5dce-4548-46ba-b461-8c98a63a7daf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3cea5dce-4548-46ba-b461-8c98a63a7daf" (UID: "3cea5dce-4548-46ba-b461-8c98a63a7daf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.857808 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdsx5\" (UniqueName: \"kubernetes.io/projected/c0826415-3b19-4838-b645-1d5e36ba6e16-kube-api-access-qdsx5\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.857832 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cea5dce-4548-46ba-b461-8c98a63a7daf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.857845 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdzc5\" (UniqueName: \"kubernetes.io/projected/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e-kube-api-access-xdzc5\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.857857 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0826415-3b19-4838-b645-1d5e36ba6e16-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.860865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cea5dce-4548-46ba-b461-8c98a63a7daf-kube-api-access-gxnh8" (OuterVolumeSpecName: "kube-api-access-gxnh8") pod "3cea5dce-4548-46ba-b461-8c98a63a7daf" (UID: "3cea5dce-4548-46ba-b461-8c98a63a7daf"). InnerVolumeSpecName "kube-api-access-gxnh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:11 crc kubenswrapper[4707]: I0129 03:45:11.960452 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxnh8\" (UniqueName: \"kubernetes.io/projected/3cea5dce-4548-46ba-b461-8c98a63a7daf-kube-api-access-gxnh8\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.062473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:12 crc kubenswrapper[4707]: E0129 03:45:12.062875 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 03:45:12 crc kubenswrapper[4707]: E0129 03:45:12.062929 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 03:45:12 crc kubenswrapper[4707]: E0129 03:45:12.063020 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift podName:249edadf-1bb4-4d39-aae3-40384ba10bae nodeName:}" failed. No retries permitted until 2026-01-29 03:45:13.062992131 +0000 UTC m=+1066.547221036 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift") pod "swift-storage-0" (UID: "249edadf-1bb4-4d39-aae3-40384ba10bae") : configmap "swift-ring-files" not found Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.217341 4707 generic.go:334] "Generic (PLEG): container finished" podID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerID="6ca1552f4071c6d249ac4cefc6015a26e36b74ca4ea609f9f11eb9e76eafb1f3" exitCode=0 Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.217448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pxd9d" event={"ID":"12874d9e-78e2-4430-a7e0-7f542dc518c0","Type":"ContainerDied","Data":"6ca1552f4071c6d249ac4cefc6015a26e36b74ca4ea609f9f11eb9e76eafb1f3"} Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.219978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-brvzb" event={"ID":"3cea5dce-4548-46ba-b461-8c98a63a7daf","Type":"ContainerDied","Data":"9ed9021ceee769d276715567d24d983c8dc799b20effc73a0caadf90824ccde5"} Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.220008 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed9021ceee769d276715567d24d983c8dc799b20effc73a0caadf90824ccde5" Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.220018 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-brvzb" Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.225833 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hxjqt" Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.225860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hxjqt" event={"ID":"c0826415-3b19-4838-b645-1d5e36ba6e16","Type":"ContainerDied","Data":"f72676f101cadc4fd2c00415e3dcd8ca889486a1e9caf1982d7d22cc2b130428"} Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.225934 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f72676f101cadc4fd2c00415e3dcd8ca889486a1e9caf1982d7d22cc2b130428" Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.229037 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66c3-account-create-update-gzzxt" event={"ID":"ba70b464-3262-4a27-b710-6ec145fc1a8f","Type":"ContainerDied","Data":"64bee9f9512f153c32e636b019db972252fec50143dd474af30933f3bf4d05db"} Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.229090 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64bee9f9512f153c32e636b019db972252fec50143dd474af30933f3bf4d05db" Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.229052 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-66c3-account-create-update-gzzxt" Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.230577 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0e72-account-create-update-shtrl" Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.230530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0e72-account-create-update-shtrl" event={"ID":"cb2a1269-a03b-4a5f-b61e-ad1f94576f9e","Type":"ContainerDied","Data":"b76411d1acde9ebfbd5409b4aa10bfb6a067aa472d9ed7b804d4419bf7651b97"} Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.230690 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b76411d1acde9ebfbd5409b4aa10bfb6a067aa472d9ed7b804d4419bf7651b97" Jan 29 03:45:12 crc kubenswrapper[4707]: I0129 03:45:12.870996 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.082553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:13 crc kubenswrapper[4707]: E0129 03:45:13.083054 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 03:45:13 crc kubenswrapper[4707]: E0129 03:45:13.083148 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 03:45:13 crc kubenswrapper[4707]: E0129 03:45:13.083322 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift podName:249edadf-1bb4-4d39-aae3-40384ba10bae nodeName:}" failed. No retries permitted until 2026-01-29 03:45:15.083291857 +0000 UTC m=+1068.567520762 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift") pod "swift-storage-0" (UID: "249edadf-1bb4-4d39-aae3-40384ba10bae") : configmap "swift-ring-files" not found Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.264122 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cf134f5-aa0d-4ddf-ad13-aaec527fcc15" path="/var/lib/kubelet/pods/5cf134f5-aa0d-4ddf-ad13-aaec527fcc15/volumes" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.264775 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.264804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pxd9d" event={"ID":"12874d9e-78e2-4430-a7e0-7f542dc518c0","Type":"ContainerStarted","Data":"f6fbfba7fedad77869794e5ddf408d7c9c846b223220661ce99cc66f0abf8153"} Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.281720 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-pxd9d" podStartSLOduration=3.281694972 podStartE2EDuration="3.281694972s" podCreationTimestamp="2026-01-29 03:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:13.272828573 +0000 UTC m=+1066.757057488" watchObservedRunningTime="2026-01-29 03:45:13.281694972 +0000 UTC m=+1066.765923877" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.546049 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2tdqh"] Jan 29 03:45:13 crc kubenswrapper[4707]: E0129 03:45:13.546483 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb2a1269-a03b-4a5f-b61e-ad1f94576f9e" containerName="mariadb-account-create-update" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.546505 
4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb2a1269-a03b-4a5f-b61e-ad1f94576f9e" containerName="mariadb-account-create-update" Jan 29 03:45:13 crc kubenswrapper[4707]: E0129 03:45:13.546548 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cea5dce-4548-46ba-b461-8c98a63a7daf" containerName="mariadb-database-create" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.546557 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cea5dce-4548-46ba-b461-8c98a63a7daf" containerName="mariadb-database-create" Jan 29 03:45:13 crc kubenswrapper[4707]: E0129 03:45:13.546575 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba70b464-3262-4a27-b710-6ec145fc1a8f" containerName="mariadb-account-create-update" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.546583 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba70b464-3262-4a27-b710-6ec145fc1a8f" containerName="mariadb-account-create-update" Jan 29 03:45:13 crc kubenswrapper[4707]: E0129 03:45:13.546609 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0826415-3b19-4838-b645-1d5e36ba6e16" containerName="mariadb-database-create" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.546616 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0826415-3b19-4838-b645-1d5e36ba6e16" containerName="mariadb-database-create" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.546818 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cea5dce-4548-46ba-b461-8c98a63a7daf" containerName="mariadb-database-create" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.546842 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba70b464-3262-4a27-b710-6ec145fc1a8f" containerName="mariadb-account-create-update" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.546855 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb2a1269-a03b-4a5f-b61e-ad1f94576f9e" 
containerName="mariadb-account-create-update" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.546868 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0826415-3b19-4838-b645-1d5e36ba6e16" containerName="mariadb-database-create" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.547483 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.591971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88180d5b-94a0-4a62-9d77-5679c91e5d07-operator-scripts\") pod \"glance-db-create-2tdqh\" (UID: \"88180d5b-94a0-4a62-9d77-5679c91e5d07\") " pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.592077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5bzs\" (UniqueName: \"kubernetes.io/projected/88180d5b-94a0-4a62-9d77-5679c91e5d07-kube-api-access-f5bzs\") pod \"glance-db-create-2tdqh\" (UID: \"88180d5b-94a0-4a62-9d77-5679c91e5d07\") " pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.593330 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f123-account-create-update-9n45l"] Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.594504 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.597894 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.606544 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2tdqh"] Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.616683 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f123-account-create-update-9n45l"] Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.694783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88180d5b-94a0-4a62-9d77-5679c91e5d07-operator-scripts\") pod \"glance-db-create-2tdqh\" (UID: \"88180d5b-94a0-4a62-9d77-5679c91e5d07\") " pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.694859 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdac9d85-fc1b-4556-8d11-6d43a6b24753-operator-scripts\") pod \"glance-f123-account-create-update-9n45l\" (UID: \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\") " pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.694891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5bzs\" (UniqueName: \"kubernetes.io/projected/88180d5b-94a0-4a62-9d77-5679c91e5d07-kube-api-access-f5bzs\") pod \"glance-db-create-2tdqh\" (UID: \"88180d5b-94a0-4a62-9d77-5679c91e5d07\") " pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.694947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sdq2\" (UniqueName: 
\"kubernetes.io/projected/cdac9d85-fc1b-4556-8d11-6d43a6b24753-kube-api-access-6sdq2\") pod \"glance-f123-account-create-update-9n45l\" (UID: \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\") " pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.697193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88180d5b-94a0-4a62-9d77-5679c91e5d07-operator-scripts\") pod \"glance-db-create-2tdqh\" (UID: \"88180d5b-94a0-4a62-9d77-5679c91e5d07\") " pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.723221 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5bzs\" (UniqueName: \"kubernetes.io/projected/88180d5b-94a0-4a62-9d77-5679c91e5d07-kube-api-access-f5bzs\") pod \"glance-db-create-2tdqh\" (UID: \"88180d5b-94a0-4a62-9d77-5679c91e5d07\") " pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.797092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdac9d85-fc1b-4556-8d11-6d43a6b24753-operator-scripts\") pod \"glance-f123-account-create-update-9n45l\" (UID: \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\") " pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.797698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sdq2\" (UniqueName: \"kubernetes.io/projected/cdac9d85-fc1b-4556-8d11-6d43a6b24753-kube-api-access-6sdq2\") pod \"glance-f123-account-create-update-9n45l\" (UID: \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\") " pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.798285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/cdac9d85-fc1b-4556-8d11-6d43a6b24753-operator-scripts\") pod \"glance-f123-account-create-update-9n45l\" (UID: \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\") " pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.819431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sdq2\" (UniqueName: \"kubernetes.io/projected/cdac9d85-fc1b-4556-8d11-6d43a6b24753-kube-api-access-6sdq2\") pod \"glance-f123-account-create-update-9n45l\" (UID: \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\") " pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.871446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:13 crc kubenswrapper[4707]: I0129 03:45:13.930576 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:14 crc kubenswrapper[4707]: I0129 03:45:14.385310 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2tdqh"] Jan 29 03:45:14 crc kubenswrapper[4707]: W0129 03:45:14.386518 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88180d5b_94a0_4a62_9d77_5679c91e5d07.slice/crio-eb8b26a6560614a526b9dd72ce8789d77e289afd25dc23bf210231559160f31b WatchSource:0}: Error finding container eb8b26a6560614a526b9dd72ce8789d77e289afd25dc23bf210231559160f31b: Status 404 returned error can't find the container with id eb8b26a6560614a526b9dd72ce8789d77e289afd25dc23bf210231559160f31b Jan 29 03:45:14 crc kubenswrapper[4707]: I0129 03:45:14.485049 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f123-account-create-update-9n45l"] Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.126182 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:15 crc kubenswrapper[4707]: E0129 03:45:15.126426 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 03:45:15 crc kubenswrapper[4707]: E0129 03:45:15.126841 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 03:45:15 crc kubenswrapper[4707]: E0129 03:45:15.126921 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift podName:249edadf-1bb4-4d39-aae3-40384ba10bae nodeName:}" failed. No retries permitted until 2026-01-29 03:45:19.126894554 +0000 UTC m=+1072.611123459 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift") pod "swift-storage-0" (UID: "249edadf-1bb4-4d39-aae3-40384ba10bae") : configmap "swift-ring-files" not found Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.257840 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bxs5q"] Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.265987 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.267919 4707 generic.go:334] "Generic (PLEG): container finished" podID="cdac9d85-fc1b-4556-8d11-6d43a6b24753" containerID="1006064bb35e480fe9e60e724004e172a59ea17a7fdbbab215838f92c2a15d3a" exitCode=0 Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.268088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f123-account-create-update-9n45l" event={"ID":"cdac9d85-fc1b-4556-8d11-6d43a6b24753","Type":"ContainerDied","Data":"1006064bb35e480fe9e60e724004e172a59ea17a7fdbbab215838f92c2a15d3a"} Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.268240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f123-account-create-update-9n45l" event={"ID":"cdac9d85-fc1b-4556-8d11-6d43a6b24753","Type":"ContainerStarted","Data":"d0bfbcdfc627b59fbf803f1d80765e87bd16144f20fba640540775e053a91e45"} Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.268422 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bxs5q"] Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.269741 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.270134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.270785 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.274166 4707 generic.go:334] "Generic (PLEG): container finished" podID="88180d5b-94a0-4a62-9d77-5679c91e5d07" containerID="606e86b23eecaebc92447482e60b7bb8b696cbfcbe09dcba491d207262bf4b4a" exitCode=0 Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.274232 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-db-create-2tdqh" event={"ID":"88180d5b-94a0-4a62-9d77-5679c91e5d07","Type":"ContainerDied","Data":"606e86b23eecaebc92447482e60b7bb8b696cbfcbe09dcba491d207262bf4b4a"} Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.274270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2tdqh" event={"ID":"88180d5b-94a0-4a62-9d77-5679c91e5d07","Type":"ContainerStarted","Data":"eb8b26a6560614a526b9dd72ce8789d77e289afd25dc23bf210231559160f31b"} Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.331705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9e5582d-dd71-4ccd-84ea-bc133dce917c-etc-swift\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.331815 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-scripts\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.331841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-dispersionconf\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.331871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-combined-ca-bundle\") pod 
\"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.331906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-ring-data-devices\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.331960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-swiftconf\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.332007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpvlq\" (UniqueName: \"kubernetes.io/projected/b9e5582d-dd71-4ccd-84ea-bc133dce917c-kube-api-access-wpvlq\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.433062 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-4hn2t"] Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.433348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-swiftconf\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.434408 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.435380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpvlq\" (UniqueName: \"kubernetes.io/projected/b9e5582d-dd71-4ccd-84ea-bc133dce917c-kube-api-access-wpvlq\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.435439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9e5582d-dd71-4ccd-84ea-bc133dce917c-etc-swift\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.435496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-scripts\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.435528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-dispersionconf\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.435593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-combined-ca-bundle\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 
03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.435638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-ring-data-devices\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.436195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9e5582d-dd71-4ccd-84ea-bc133dce917c-etc-swift\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.439473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-ring-data-devices\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.440980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-scripts\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.441050 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.445637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-combined-ca-bundle\") pod \"swift-ring-rebalance-bxs5q\" (UID: 
\"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.454097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-dispersionconf\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.457753 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4hn2t"] Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.460029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-swiftconf\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.468350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpvlq\" (UniqueName: \"kubernetes.io/projected/b9e5582d-dd71-4ccd-84ea-bc133dce917c-kube-api-access-wpvlq\") pod \"swift-ring-rebalance-bxs5q\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") " pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.537641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfp7d\" (UniqueName: \"kubernetes.io/projected/75870b48-35b8-4667-8f64-76461736064f-kube-api-access-sfp7d\") pod \"root-account-create-update-4hn2t\" (UID: \"75870b48-35b8-4667-8f64-76461736064f\") " pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.538582 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/75870b48-35b8-4667-8f64-76461736064f-operator-scripts\") pod \"root-account-create-update-4hn2t\" (UID: \"75870b48-35b8-4667-8f64-76461736064f\") " pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.600472 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bxs5q" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.640951 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75870b48-35b8-4667-8f64-76461736064f-operator-scripts\") pod \"root-account-create-update-4hn2t\" (UID: \"75870b48-35b8-4667-8f64-76461736064f\") " pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.641062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfp7d\" (UniqueName: \"kubernetes.io/projected/75870b48-35b8-4667-8f64-76461736064f-kube-api-access-sfp7d\") pod \"root-account-create-update-4hn2t\" (UID: \"75870b48-35b8-4667-8f64-76461736064f\") " pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.642214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75870b48-35b8-4667-8f64-76461736064f-operator-scripts\") pod \"root-account-create-update-4hn2t\" (UID: \"75870b48-35b8-4667-8f64-76461736064f\") " pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.659719 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfp7d\" (UniqueName: \"kubernetes.io/projected/75870b48-35b8-4667-8f64-76461736064f-kube-api-access-sfp7d\") pod \"root-account-create-update-4hn2t\" (UID: \"75870b48-35b8-4667-8f64-76461736064f\") " 
pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:15 crc kubenswrapper[4707]: I0129 03:45:15.815645 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.083041 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bxs5q"] Jan 29 03:45:16 crc kubenswrapper[4707]: W0129 03:45:16.086676 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9e5582d_dd71_4ccd_84ea_bc133dce917c.slice/crio-fb9334a8edfb0b8e1cda6eef7af057d18b672dd739bb8d8be9f8f8d52b2a2b8b WatchSource:0}: Error finding container fb9334a8edfb0b8e1cda6eef7af057d18b672dd739bb8d8be9f8f8d52b2a2b8b: Status 404 returned error can't find the container with id fb9334a8edfb0b8e1cda6eef7af057d18b672dd739bb8d8be9f8f8d52b2a2b8b Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.288479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bxs5q" event={"ID":"b9e5582d-dd71-4ccd-84ea-bc133dce917c","Type":"ContainerStarted","Data":"fb9334a8edfb0b8e1cda6eef7af057d18b672dd739bb8d8be9f8f8d52b2a2b8b"} Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.303490 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4hn2t"] Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.648414 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.702871 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.768439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdac9d85-fc1b-4556-8d11-6d43a6b24753-operator-scripts\") pod \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\" (UID: \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\") " Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.768501 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sdq2\" (UniqueName: \"kubernetes.io/projected/cdac9d85-fc1b-4556-8d11-6d43a6b24753-kube-api-access-6sdq2\") pod \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\" (UID: \"cdac9d85-fc1b-4556-8d11-6d43a6b24753\") " Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.768551 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88180d5b-94a0-4a62-9d77-5679c91e5d07-operator-scripts\") pod \"88180d5b-94a0-4a62-9d77-5679c91e5d07\" (UID: \"88180d5b-94a0-4a62-9d77-5679c91e5d07\") " Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.768670 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5bzs\" (UniqueName: \"kubernetes.io/projected/88180d5b-94a0-4a62-9d77-5679c91e5d07-kube-api-access-f5bzs\") pod \"88180d5b-94a0-4a62-9d77-5679c91e5d07\" (UID: \"88180d5b-94a0-4a62-9d77-5679c91e5d07\") " Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.769504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdac9d85-fc1b-4556-8d11-6d43a6b24753-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cdac9d85-fc1b-4556-8d11-6d43a6b24753" (UID: "cdac9d85-fc1b-4556-8d11-6d43a6b24753"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.769504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88180d5b-94a0-4a62-9d77-5679c91e5d07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88180d5b-94a0-4a62-9d77-5679c91e5d07" (UID: "88180d5b-94a0-4a62-9d77-5679c91e5d07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.776329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdac9d85-fc1b-4556-8d11-6d43a6b24753-kube-api-access-6sdq2" (OuterVolumeSpecName: "kube-api-access-6sdq2") pod "cdac9d85-fc1b-4556-8d11-6d43a6b24753" (UID: "cdac9d85-fc1b-4556-8d11-6d43a6b24753"). InnerVolumeSpecName "kube-api-access-6sdq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.776399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88180d5b-94a0-4a62-9d77-5679c91e5d07-kube-api-access-f5bzs" (OuterVolumeSpecName: "kube-api-access-f5bzs") pod "88180d5b-94a0-4a62-9d77-5679c91e5d07" (UID: "88180d5b-94a0-4a62-9d77-5679c91e5d07"). InnerVolumeSpecName "kube-api-access-f5bzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.870660 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5bzs\" (UniqueName: \"kubernetes.io/projected/88180d5b-94a0-4a62-9d77-5679c91e5d07-kube-api-access-f5bzs\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.870701 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdac9d85-fc1b-4556-8d11-6d43a6b24753-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.870718 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sdq2\" (UniqueName: \"kubernetes.io/projected/cdac9d85-fc1b-4556-8d11-6d43a6b24753-kube-api-access-6sdq2\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:16 crc kubenswrapper[4707]: I0129 03:45:16.870730 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88180d5b-94a0-4a62-9d77-5679c91e5d07-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:17 crc kubenswrapper[4707]: I0129 03:45:17.305211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2tdqh" event={"ID":"88180d5b-94a0-4a62-9d77-5679c91e5d07","Type":"ContainerDied","Data":"eb8b26a6560614a526b9dd72ce8789d77e289afd25dc23bf210231559160f31b"} Jan 29 03:45:17 crc kubenswrapper[4707]: I0129 03:45:17.305700 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb8b26a6560614a526b9dd72ce8789d77e289afd25dc23bf210231559160f31b" Jan 29 03:45:17 crc kubenswrapper[4707]: I0129 03:45:17.305238 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2tdqh" Jan 29 03:45:17 crc kubenswrapper[4707]: I0129 03:45:17.309458 4707 generic.go:334] "Generic (PLEG): container finished" podID="75870b48-35b8-4667-8f64-76461736064f" containerID="8a2b86e2615e61784ad39188cb18d1b8e30c792668a4fd33763356ee7f7d4260" exitCode=0 Jan 29 03:45:17 crc kubenswrapper[4707]: I0129 03:45:17.309515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4hn2t" event={"ID":"75870b48-35b8-4667-8f64-76461736064f","Type":"ContainerDied","Data":"8a2b86e2615e61784ad39188cb18d1b8e30c792668a4fd33763356ee7f7d4260"} Jan 29 03:45:17 crc kubenswrapper[4707]: I0129 03:45:17.309592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4hn2t" event={"ID":"75870b48-35b8-4667-8f64-76461736064f","Type":"ContainerStarted","Data":"54f5dbc3ee18d6ea9d4f91b8297e40c5114688a5a272eb9377891f8124a2b86d"} Jan 29 03:45:17 crc kubenswrapper[4707]: I0129 03:45:17.313464 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f123-account-create-update-9n45l" event={"ID":"cdac9d85-fc1b-4556-8d11-6d43a6b24753","Type":"ContainerDied","Data":"d0bfbcdfc627b59fbf803f1d80765e87bd16144f20fba640540775e053a91e45"} Jan 29 03:45:17 crc kubenswrapper[4707]: I0129 03:45:17.313520 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0bfbcdfc627b59fbf803f1d80765e87bd16144f20fba640540775e053a91e45" Jan 29 03:45:17 crc kubenswrapper[4707]: I0129 03:45:17.313547 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f123-account-create-update-9n45l" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.795419 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-m74xr"] Jan 29 03:45:18 crc kubenswrapper[4707]: E0129 03:45:18.796326 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88180d5b-94a0-4a62-9d77-5679c91e5d07" containerName="mariadb-database-create" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.796356 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="88180d5b-94a0-4a62-9d77-5679c91e5d07" containerName="mariadb-database-create" Jan 29 03:45:18 crc kubenswrapper[4707]: E0129 03:45:18.796380 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdac9d85-fc1b-4556-8d11-6d43a6b24753" containerName="mariadb-account-create-update" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.796386 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdac9d85-fc1b-4556-8d11-6d43a6b24753" containerName="mariadb-account-create-update" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.796604 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdac9d85-fc1b-4556-8d11-6d43a6b24753" containerName="mariadb-account-create-update" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.796622 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="88180d5b-94a0-4a62-9d77-5679c91e5d07" containerName="mariadb-database-create" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.797195 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.799298 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.799496 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tqq24" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.819327 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m74xr"] Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.922348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-combined-ca-bundle\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.922901 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82t66\" (UniqueName: \"kubernetes.io/projected/d65e0ca5-1e58-4492-bd8d-92ff6d516014-kube-api-access-82t66\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.922987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-db-sync-config-data\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:18 crc kubenswrapper[4707]: I0129 03:45:18.923045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-config-data\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.026392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-combined-ca-bundle\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.026457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82t66\" (UniqueName: \"kubernetes.io/projected/d65e0ca5-1e58-4492-bd8d-92ff6d516014-kube-api-access-82t66\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.026667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-db-sync-config-data\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.026729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-config-data\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.034180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-db-sync-config-data\") pod \"glance-db-sync-m74xr\" (UID: 
\"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.035507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-config-data\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.035767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-combined-ca-bundle\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.048596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82t66\" (UniqueName: \"kubernetes.io/projected/d65e0ca5-1e58-4492-bd8d-92ff6d516014-kube-api-access-82t66\") pod \"glance-db-sync-m74xr\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.128317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0" Jan 29 03:45:19 crc kubenswrapper[4707]: E0129 03:45:19.128527 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 03:45:19 crc kubenswrapper[4707]: E0129 03:45:19.128793 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 03:45:19 crc kubenswrapper[4707]: E0129 03:45:19.128856 4707 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift podName:249edadf-1bb4-4d39-aae3-40384ba10bae nodeName:}" failed. No retries permitted until 2026-01-29 03:45:27.128829922 +0000 UTC m=+1080.613058827 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift") pod "swift-storage-0" (UID: "249edadf-1bb4-4d39-aae3-40384ba10bae") : configmap "swift-ring-files" not found Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.128872 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.346917 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6e14cde-a343-4dc3-b429-77968ac0b7a5" containerID="6503da289d12f1bfc63a47f7d284f1b2b501095f2d4223347f4b591fe440389f" exitCode=0 Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.347043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6e14cde-a343-4dc3-b429-77968ac0b7a5","Type":"ContainerDied","Data":"6503da289d12f1bfc63a47f7d284f1b2b501095f2d4223347f4b591fe440389f"} Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.350512 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8dec80d-f976-4316-9d4a-c18cbefe36ba" containerID="6d5d6d2e0fa06b8db3118e28e603b5b03f428a0b7fde78b5b540f4727f0a499a" exitCode=0 Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.350570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8dec80d-f976-4316-9d4a-c18cbefe36ba","Type":"ContainerDied","Data":"6d5d6d2e0fa06b8db3118e28e603b5b03f428a0b7fde78b5b540f4727f0a499a"} Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.769921 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.844862 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75870b48-35b8-4667-8f64-76461736064f-operator-scripts\") pod \"75870b48-35b8-4667-8f64-76461736064f\" (UID: \"75870b48-35b8-4667-8f64-76461736064f\") " Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.845044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfp7d\" (UniqueName: \"kubernetes.io/projected/75870b48-35b8-4667-8f64-76461736064f-kube-api-access-sfp7d\") pod \"75870b48-35b8-4667-8f64-76461736064f\" (UID: \"75870b48-35b8-4667-8f64-76461736064f\") " Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.846114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75870b48-35b8-4667-8f64-76461736064f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75870b48-35b8-4667-8f64-76461736064f" (UID: "75870b48-35b8-4667-8f64-76461736064f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.850777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75870b48-35b8-4667-8f64-76461736064f-kube-api-access-sfp7d" (OuterVolumeSpecName: "kube-api-access-sfp7d") pod "75870b48-35b8-4667-8f64-76461736064f" (UID: "75870b48-35b8-4667-8f64-76461736064f"). InnerVolumeSpecName "kube-api-access-sfp7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.947635 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfp7d\" (UniqueName: \"kubernetes.io/projected/75870b48-35b8-4667-8f64-76461736064f-kube-api-access-sfp7d\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:19 crc kubenswrapper[4707]: I0129 03:45:19.947686 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75870b48-35b8-4667-8f64-76461736064f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.106847 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m74xr"] Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.364903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4hn2t" event={"ID":"75870b48-35b8-4667-8f64-76461736064f","Type":"ContainerDied","Data":"54f5dbc3ee18d6ea9d4f91b8297e40c5114688a5a272eb9377891f8124a2b86d"} Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.365195 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54f5dbc3ee18d6ea9d4f91b8297e40c5114688a5a272eb9377891f8124a2b86d" Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.365394 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4hn2t" Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.375697 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m74xr" event={"ID":"d65e0ca5-1e58-4492-bd8d-92ff6d516014","Type":"ContainerStarted","Data":"ba189fd8fd2c4bf629df91d397dbbaecdb9387cdb5700245c24f9169e7bd9554"} Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.378315 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6e14cde-a343-4dc3-b429-77968ac0b7a5","Type":"ContainerStarted","Data":"e4a63754624f94ee910cfd791357cb62838ed5eec49488f409edb3aaf62a64ac"} Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.378609 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.381840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8dec80d-f976-4316-9d4a-c18cbefe36ba","Type":"ContainerStarted","Data":"06e5b149d2301f0677cec170ad9b62ae8d43974756045dd69ee1c1c44a53baba"} Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.382366 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.388226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bxs5q" event={"ID":"b9e5582d-dd71-4ccd-84ea-bc133dce917c","Type":"ContainerStarted","Data":"af9b115593a9d6899f7578887b411edbf17e46eae2e4c50e30f0365bb7748766"} Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.415571 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.838437259 podStartE2EDuration="57.415518472s" podCreationTimestamp="2026-01-29 03:44:23 +0000 UTC" firstStartedPulling="2026-01-29 03:44:26.160493853 +0000 UTC 
m=+1019.644722758" lastFinishedPulling="2026-01-29 03:44:45.737575066 +0000 UTC m=+1039.221803971" observedRunningTime="2026-01-29 03:45:20.408410892 +0000 UTC m=+1073.892639827" watchObservedRunningTime="2026-01-29 03:45:20.415518472 +0000 UTC m=+1073.899747377" Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.448808 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.072195256 podStartE2EDuration="56.448778636s" podCreationTimestamp="2026-01-29 03:44:24 +0000 UTC" firstStartedPulling="2026-01-29 03:44:26.358235479 +0000 UTC m=+1019.842464384" lastFinishedPulling="2026-01-29 03:44:45.734818859 +0000 UTC m=+1039.219047764" observedRunningTime="2026-01-29 03:45:20.441674447 +0000 UTC m=+1073.925903392" watchObservedRunningTime="2026-01-29 03:45:20.448778636 +0000 UTC m=+1073.933007541" Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.474168 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-bxs5q" podStartSLOduration=1.921994088 podStartE2EDuration="5.474126239s" podCreationTimestamp="2026-01-29 03:45:15 +0000 UTC" firstStartedPulling="2026-01-29 03:45:16.089674044 +0000 UTC m=+1069.573902949" lastFinishedPulling="2026-01-29 03:45:19.641806175 +0000 UTC m=+1073.126035100" observedRunningTime="2026-01-29 03:45:20.467895734 +0000 UTC m=+1073.952124659" watchObservedRunningTime="2026-01-29 03:45:20.474126239 +0000 UTC m=+1073.958355154" Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.661907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.741168 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vm5f5"] Jan 29 03:45:20 crc kubenswrapper[4707]: I0129 03:45:20.741582 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" podUID="1a97493c-cef6-4cf6-8c16-76864d7c24cc" containerName="dnsmasq-dns" containerID="cri-o://5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd" gracePeriod=10 Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.231011 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.275351 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-nb\") pod \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.275688 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-dns-svc\") pod \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.275801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q4ff\" (UniqueName: \"kubernetes.io/projected/1a97493c-cef6-4cf6-8c16-76864d7c24cc-kube-api-access-2q4ff\") pod \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.275887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-sb\") pod \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.275970 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-config\") pod \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\" (UID: \"1a97493c-cef6-4cf6-8c16-76864d7c24cc\") " Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.290929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a97493c-cef6-4cf6-8c16-76864d7c24cc-kube-api-access-2q4ff" (OuterVolumeSpecName: "kube-api-access-2q4ff") pod "1a97493c-cef6-4cf6-8c16-76864d7c24cc" (UID: "1a97493c-cef6-4cf6-8c16-76864d7c24cc"). InnerVolumeSpecName "kube-api-access-2q4ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.318143 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a97493c-cef6-4cf6-8c16-76864d7c24cc" (UID: "1a97493c-cef6-4cf6-8c16-76864d7c24cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.332321 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a97493c-cef6-4cf6-8c16-76864d7c24cc" (UID: "1a97493c-cef6-4cf6-8c16-76864d7c24cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.332517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a97493c-cef6-4cf6-8c16-76864d7c24cc" (UID: "1a97493c-cef6-4cf6-8c16-76864d7c24cc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.347690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-config" (OuterVolumeSpecName: "config") pod "1a97493c-cef6-4cf6-8c16-76864d7c24cc" (UID: "1a97493c-cef6-4cf6-8c16-76864d7c24cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.377879 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.377909 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q4ff\" (UniqueName: \"kubernetes.io/projected/1a97493c-cef6-4cf6-8c16-76864d7c24cc-kube-api-access-2q4ff\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.377921 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.377931 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.377940 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a97493c-cef6-4cf6-8c16-76864d7c24cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.412319 4707 generic.go:334] "Generic (PLEG): container finished" podID="1a97493c-cef6-4cf6-8c16-76864d7c24cc" 
containerID="5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd" exitCode=0 Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.412909 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.413666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" event={"ID":"1a97493c-cef6-4cf6-8c16-76864d7c24cc","Type":"ContainerDied","Data":"5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd"} Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.413782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-vm5f5" event={"ID":"1a97493c-cef6-4cf6-8c16-76864d7c24cc","Type":"ContainerDied","Data":"115b4f6808c9e902ceb345171f5ec9560914cd26c8554282d95cbc4f12003340"} Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.413844 4707 scope.go:117] "RemoveContainer" containerID="5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.450271 4707 scope.go:117] "RemoveContainer" containerID="16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.457167 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vm5f5"] Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.463749 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-vm5f5"] Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.484052 4707 scope.go:117] "RemoveContainer" containerID="5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd" Jan 29 03:45:21 crc kubenswrapper[4707]: E0129 03:45:21.487064 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd\": container with ID starting with 5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd not found: ID does not exist" containerID="5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.487240 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd"} err="failed to get container status \"5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd\": rpc error: code = NotFound desc = could not find container \"5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd\": container with ID starting with 5ee6a4a6bc87142ce3d300e6c87936a1bfb2a78940c3878809f2c906ac3f1efd not found: ID does not exist" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.487343 4707 scope.go:117] "RemoveContainer" containerID="16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468" Jan 29 03:45:21 crc kubenswrapper[4707]: E0129 03:45:21.488446 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468\": container with ID starting with 16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468 not found: ID does not exist" containerID="16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468" Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.488516 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468"} err="failed to get container status \"16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468\": rpc error: code = NotFound desc = could not find container \"16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468\": container with ID 
starting with 16befb31607e9d70fbeec4ede84b72501c8d7ffe39af05a8e2bb2c667faef468 not found: ID does not exist"
Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.880223 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-4hn2t"]
Jan 29 03:45:21 crc kubenswrapper[4707]: I0129 03:45:21.887240 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-4hn2t"]
Jan 29 03:45:22 crc kubenswrapper[4707]: I0129 03:45:22.802876 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Jan 29 03:45:23 crc kubenswrapper[4707]: I0129 03:45:23.258440 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a97493c-cef6-4cf6-8c16-76864d7c24cc" path="/var/lib/kubelet/pods/1a97493c-cef6-4cf6-8c16-76864d7c24cc/volumes"
Jan 29 03:45:23 crc kubenswrapper[4707]: I0129 03:45:23.259164 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75870b48-35b8-4667-8f64-76461736064f" path="/var/lib/kubelet/pods/75870b48-35b8-4667-8f64-76461736064f/volumes"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.473556 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-v9z9m"]
Jan 29 03:45:25 crc kubenswrapper[4707]: E0129 03:45:25.474396 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a97493c-cef6-4cf6-8c16-76864d7c24cc" containerName="dnsmasq-dns"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.474415 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a97493c-cef6-4cf6-8c16-76864d7c24cc" containerName="dnsmasq-dns"
Jan 29 03:45:25 crc kubenswrapper[4707]: E0129 03:45:25.474436 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a97493c-cef6-4cf6-8c16-76864d7c24cc" containerName="init"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.474444 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a97493c-cef6-4cf6-8c16-76864d7c24cc" containerName="init"
Jan 29 03:45:25 crc kubenswrapper[4707]: E0129 03:45:25.474464 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75870b48-35b8-4667-8f64-76461736064f" containerName="mariadb-account-create-update"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.474474 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75870b48-35b8-4667-8f64-76461736064f" containerName="mariadb-account-create-update"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.474664 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a97493c-cef6-4cf6-8c16-76864d7c24cc" containerName="dnsmasq-dns"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.474682 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75870b48-35b8-4667-8f64-76461736064f" containerName="mariadb-account-create-update"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.475260 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-v9z9m"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.485559 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.492814 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-v9z9m"]
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.588091 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc2eede-643f-4839-a293-5211e57c59ed-operator-scripts\") pod \"root-account-create-update-v9z9m\" (UID: \"1cc2eede-643f-4839-a293-5211e57c59ed\") " pod="openstack/root-account-create-update-v9z9m"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.588187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l88ft\" (UniqueName: \"kubernetes.io/projected/1cc2eede-643f-4839-a293-5211e57c59ed-kube-api-access-l88ft\") pod \"root-account-create-update-v9z9m\" (UID: \"1cc2eede-643f-4839-a293-5211e57c59ed\") " pod="openstack/root-account-create-update-v9z9m"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.690703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc2eede-643f-4839-a293-5211e57c59ed-operator-scripts\") pod \"root-account-create-update-v9z9m\" (UID: \"1cc2eede-643f-4839-a293-5211e57c59ed\") " pod="openstack/root-account-create-update-v9z9m"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.690815 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l88ft\" (UniqueName: \"kubernetes.io/projected/1cc2eede-643f-4839-a293-5211e57c59ed-kube-api-access-l88ft\") pod \"root-account-create-update-v9z9m\" (UID: \"1cc2eede-643f-4839-a293-5211e57c59ed\") " pod="openstack/root-account-create-update-v9z9m"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.691685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc2eede-643f-4839-a293-5211e57c59ed-operator-scripts\") pod \"root-account-create-update-v9z9m\" (UID: \"1cc2eede-643f-4839-a293-5211e57c59ed\") " pod="openstack/root-account-create-update-v9z9m"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.711999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l88ft\" (UniqueName: \"kubernetes.io/projected/1cc2eede-643f-4839-a293-5211e57c59ed-kube-api-access-l88ft\") pod \"root-account-create-update-v9z9m\" (UID: \"1cc2eede-643f-4839-a293-5211e57c59ed\") " pod="openstack/root-account-create-update-v9z9m"
Jan 29 03:45:25 crc kubenswrapper[4707]: I0129 03:45:25.796672 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-v9z9m"
Jan 29 03:45:26 crc kubenswrapper[4707]: I0129 03:45:26.365122 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-v9z9m"]
Jan 29 03:45:27 crc kubenswrapper[4707]: I0129 03:45:27.222967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0"
Jan 29 03:45:27 crc kubenswrapper[4707]: I0129 03:45:27.236762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/249edadf-1bb4-4d39-aae3-40384ba10bae-etc-swift\") pod \"swift-storage-0\" (UID: \"249edadf-1bb4-4d39-aae3-40384ba10bae\") " pod="openstack/swift-storage-0"
Jan 29 03:45:27 crc kubenswrapper[4707]: I0129 03:45:27.294079 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Jan 29 03:45:27 crc kubenswrapper[4707]: I0129 03:45:27.478521 4707 generic.go:334] "Generic (PLEG): container finished" podID="b9e5582d-dd71-4ccd-84ea-bc133dce917c" containerID="af9b115593a9d6899f7578887b411edbf17e46eae2e4c50e30f0365bb7748766" exitCode=0
Jan 29 03:45:27 crc kubenswrapper[4707]: I0129 03:45:27.478583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bxs5q" event={"ID":"b9e5582d-dd71-4ccd-84ea-bc133dce917c","Type":"ContainerDied","Data":"af9b115593a9d6899f7578887b411edbf17e46eae2e4c50e30f0365bb7748766"}
Jan 29 03:45:28 crc kubenswrapper[4707]: I0129 03:45:28.638860 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hpq5q" podUID="9f831116-140a-4c6b-8d7c-aad99fcaf97c" containerName="ovn-controller" probeResult="failure" output=<
Jan 29 03:45:28 crc kubenswrapper[4707]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 29 03:45:28 crc kubenswrapper[4707]: >
Jan 29 03:45:28 crc kubenswrapper[4707]: I0129 03:45:28.654420 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:45:28 crc kubenswrapper[4707]: I0129 03:45:28.666234 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hxz2d"
Jan 29 03:45:28 crc kubenswrapper[4707]: I0129 03:45:28.920506 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hpq5q-config-kv49j"]
Jan 29 03:45:28 crc kubenswrapper[4707]: I0129 03:45:28.922029 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:28 crc kubenswrapper[4707]: I0129 03:45:28.925708 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Jan 29 03:45:28 crc kubenswrapper[4707]: I0129 03:45:28.935466 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hpq5q-config-kv49j"]
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.068048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-additional-scripts\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.068155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.068217 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-scripts\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.068243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-log-ovn\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.068294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6pxt\" (UniqueName: \"kubernetes.io/projected/0afd1e99-88b4-481e-9af4-8a2eac081dea-kube-api-access-n6pxt\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.068340 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run-ovn\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.170239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6pxt\" (UniqueName: \"kubernetes.io/projected/0afd1e99-88b4-481e-9af4-8a2eac081dea-kube-api-access-n6pxt\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.170301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run-ovn\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.170340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-additional-scripts\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.170394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.170443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-scripts\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.170459 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-log-ovn\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.170738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-log-ovn\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.170744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run-ovn\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.170801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.171517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-additional-scripts\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.174405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-scripts\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.200684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6pxt\" (UniqueName: \"kubernetes.io/projected/0afd1e99-88b4-481e-9af4-8a2eac081dea-kube-api-access-n6pxt\") pod \"ovn-controller-hpq5q-config-kv49j\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:29 crc kubenswrapper[4707]: I0129 03:45:29.259484 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:33 crc kubenswrapper[4707]: W0129 03:45:33.073447 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cc2eede_643f_4839_a293_5211e57c59ed.slice/crio-5ef4ee646dcb8a28f03bc0a5068ead5bb4540e2ff3875063a26ee2312e11228f WatchSource:0}: Error finding container 5ef4ee646dcb8a28f03bc0a5068ead5bb4540e2ff3875063a26ee2312e11228f: Status 404 returned error can't find the container with id 5ef4ee646dcb8a28f03bc0a5068ead5bb4540e2ff3875063a26ee2312e11228f
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.156267 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.386158 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bxs5q"
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.462925 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.463445 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.468990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9e5582d-dd71-4ccd-84ea-bc133dce917c-etc-swift\") pod \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") "
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.469060 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-combined-ca-bundle\") pod \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") "
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.469109 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-ring-data-devices\") pod \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") "
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.469137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-scripts\") pod \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") "
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.469226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-dispersionconf\") pod \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") "
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.469282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpvlq\" (UniqueName: \"kubernetes.io/projected/b9e5582d-dd71-4ccd-84ea-bc133dce917c-kube-api-access-wpvlq\") pod \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") "
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.469312 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-swiftconf\") pod \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\" (UID: \"b9e5582d-dd71-4ccd-84ea-bc133dce917c\") "
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.471150 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b9e5582d-dd71-4ccd-84ea-bc133dce917c" (UID: "b9e5582d-dd71-4ccd-84ea-bc133dce917c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.472526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9e5582d-dd71-4ccd-84ea-bc133dce917c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b9e5582d-dd71-4ccd-84ea-bc133dce917c" (UID: "b9e5582d-dd71-4ccd-84ea-bc133dce917c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.477448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e5582d-dd71-4ccd-84ea-bc133dce917c-kube-api-access-wpvlq" (OuterVolumeSpecName: "kube-api-access-wpvlq") pod "b9e5582d-dd71-4ccd-84ea-bc133dce917c" (UID: "b9e5582d-dd71-4ccd-84ea-bc133dce917c"). InnerVolumeSpecName "kube-api-access-wpvlq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.486027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b9e5582d-dd71-4ccd-84ea-bc133dce917c" (UID: "b9e5582d-dd71-4ccd-84ea-bc133dce917c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.506945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-scripts" (OuterVolumeSpecName: "scripts") pod "b9e5582d-dd71-4ccd-84ea-bc133dce917c" (UID: "b9e5582d-dd71-4ccd-84ea-bc133dce917c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.509024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9e5582d-dd71-4ccd-84ea-bc133dce917c" (UID: "b9e5582d-dd71-4ccd-84ea-bc133dce917c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.551326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b9e5582d-dd71-4ccd-84ea-bc133dce917c" (UID: "b9e5582d-dd71-4ccd-84ea-bc133dce917c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.571815 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9e5582d-dd71-4ccd-84ea-bc133dce917c-etc-swift\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.571852 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.571867 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-ring-data-devices\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.571878 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9e5582d-dd71-4ccd-84ea-bc133dce917c-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.571887 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-dispersionconf\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.571899 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpvlq\" (UniqueName: \"kubernetes.io/projected/b9e5582d-dd71-4ccd-84ea-bc133dce917c-kube-api-access-wpvlq\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.571909 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9e5582d-dd71-4ccd-84ea-bc133dce917c-swiftconf\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.588685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bxs5q" event={"ID":"b9e5582d-dd71-4ccd-84ea-bc133dce917c","Type":"ContainerDied","Data":"fb9334a8edfb0b8e1cda6eef7af057d18b672dd739bb8d8be9f8f8d52b2a2b8b"}
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.588740 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb9334a8edfb0b8e1cda6eef7af057d18b672dd739bb8d8be9f8f8d52b2a2b8b"
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.588831 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bxs5q"
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.592917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v9z9m" event={"ID":"1cc2eede-643f-4839-a293-5211e57c59ed","Type":"ContainerStarted","Data":"97a0567baefc6d5145483c8d2933ed942b1feedf6734609729fa79e60e6c56dd"}
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.592987 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v9z9m" event={"ID":"1cc2eede-643f-4839-a293-5211e57c59ed","Type":"ContainerStarted","Data":"5ef4ee646dcb8a28f03bc0a5068ead5bb4540e2ff3875063a26ee2312e11228f"}
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.615371 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-v9z9m" podStartSLOduration=8.615350742 podStartE2EDuration="8.615350742s" podCreationTimestamp="2026-01-29 03:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:33.611780252 +0000 UTC m=+1087.096009157" watchObservedRunningTime="2026-01-29 03:45:33.615350742 +0000 UTC m=+1087.099579647"
Jan 29 03:45:33 crc kubenswrapper[4707]: E0129 03:45:33.645434 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9e5582d_dd71_4ccd_84ea_bc133dce917c.slice/crio-fb9334a8edfb0b8e1cda6eef7af057d18b672dd739bb8d8be9f8f8d52b2a2b8b\": RecentStats: unable to find data in memory cache]"
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.650255 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hpq5q" podUID="9f831116-140a-4c6b-8d7c-aad99fcaf97c" containerName="ovn-controller" probeResult="failure" output=<
Jan 29 03:45:33 crc kubenswrapper[4707]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 29 03:45:33 crc kubenswrapper[4707]: >
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.743329 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hpq5q-config-kv49j"]
Jan 29 03:45:33 crc kubenswrapper[4707]: W0129 03:45:33.824477 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0afd1e99_88b4_481e_9af4_8a2eac081dea.slice/crio-e919d3e302e670e9f673d412bd23cfda5b2ed903ef9dd28789f23b63e824d690 WatchSource:0}: Error finding container e919d3e302e670e9f673d412bd23cfda5b2ed903ef9dd28789f23b63e824d690: Status 404 returned error can't find the container with id e919d3e302e670e9f673d412bd23cfda5b2ed903ef9dd28789f23b63e824d690
Jan 29 03:45:33 crc kubenswrapper[4707]: I0129 03:45:33.934406 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Jan 29 03:45:33 crc kubenswrapper[4707]: W0129 03:45:33.948369 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod249edadf_1bb4_4d39_aae3_40384ba10bae.slice/crio-288a48374bdaf74186fd1a586729dc36665f8c02031f1947940d727141ce9848 WatchSource:0}: Error finding container 288a48374bdaf74186fd1a586729dc36665f8c02031f1947940d727141ce9848: Status 404 returned error can't find the container with id 288a48374bdaf74186fd1a586729dc36665f8c02031f1947940d727141ce9848
Jan 29 03:45:34 crc kubenswrapper[4707]: I0129 03:45:34.605198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m74xr" event={"ID":"d65e0ca5-1e58-4492-bd8d-92ff6d516014","Type":"ContainerStarted","Data":"00c45d457cee63fe9383969a0ffe0a8b523bcecdf1d1ad3985daaadbb025260c"}
Jan 29 03:45:34 crc kubenswrapper[4707]: I0129 03:45:34.612291 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q-config-kv49j" event={"ID":"0afd1e99-88b4-481e-9af4-8a2eac081dea","Type":"ContainerStarted","Data":"64e6665ac05c704158abd73a0de9a85e1c5b623376a0fa745e63b5c66d3ad44b"}
Jan 29 03:45:34 crc kubenswrapper[4707]: I0129 03:45:34.612404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q-config-kv49j" event={"ID":"0afd1e99-88b4-481e-9af4-8a2eac081dea","Type":"ContainerStarted","Data":"e919d3e302e670e9f673d412bd23cfda5b2ed903ef9dd28789f23b63e824d690"}
Jan 29 03:45:34 crc kubenswrapper[4707]: I0129 03:45:34.615567 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v9z9m" event={"ID":"1cc2eede-643f-4839-a293-5211e57c59ed","Type":"ContainerDied","Data":"97a0567baefc6d5145483c8d2933ed942b1feedf6734609729fa79e60e6c56dd"}
Jan 29 03:45:34 crc kubenswrapper[4707]: I0129 03:45:34.615423 4707 generic.go:334] "Generic (PLEG): container finished" podID="1cc2eede-643f-4839-a293-5211e57c59ed" containerID="97a0567baefc6d5145483c8d2933ed942b1feedf6734609729fa79e60e6c56dd" exitCode=0
Jan 29 03:45:34 crc kubenswrapper[4707]: I0129 03:45:34.617271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"288a48374bdaf74186fd1a586729dc36665f8c02031f1947940d727141ce9848"}
Jan 29 03:45:34 crc kubenswrapper[4707]: I0129 03:45:34.630627 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-m74xr" podStartSLOduration=3.489408444 podStartE2EDuration="16.630607337s" podCreationTimestamp="2026-01-29 03:45:18 +0000 UTC" firstStartedPulling="2026-01-29 03:45:20.125443152 +0000 UTC m=+1073.609672067" lastFinishedPulling="2026-01-29 03:45:33.266642045 +0000 UTC m=+1086.750870960" observedRunningTime="2026-01-29 03:45:34.627066848 +0000 UTC m=+1088.111295753" watchObservedRunningTime="2026-01-29 03:45:34.630607337 +0000 UTC m=+1088.114836242"
Jan 29 03:45:35 crc kubenswrapper[4707]: I0129 03:45:35.626734 4707 generic.go:334] "Generic (PLEG): container finished" podID="0afd1e99-88b4-481e-9af4-8a2eac081dea" containerID="64e6665ac05c704158abd73a0de9a85e1c5b623376a0fa745e63b5c66d3ad44b" exitCode=0
Jan 29 03:45:35 crc kubenswrapper[4707]: I0129 03:45:35.626793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q-config-kv49j" event={"ID":"0afd1e99-88b4-481e-9af4-8a2eac081dea","Type":"ContainerDied","Data":"64e6665ac05c704158abd73a0de9a85e1c5b623376a0fa745e63b5c66d3ad44b"}
Jan 29 03:45:35 crc kubenswrapper[4707]: I0129 03:45:35.632951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"1f5fbce362e8b55e40dfeadea70f8503dc765780d5e62caa5e23106a3a469291"}
Jan 29 03:45:35 crc kubenswrapper[4707]: I0129 03:45:35.632996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"1ecdb02152ebd209328ff7408011f68d64cf8db33d32f5f6f113c0985e34bbc0"}
Jan 29 03:45:35 crc kubenswrapper[4707]: I0129 03:45:35.638646 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Jan 29 03:45:35 crc kubenswrapper[4707]: I0129 03:45:35.765219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.178219 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-66fxr"]
Jan 29 03:45:36 crc kubenswrapper[4707]: E0129 03:45:36.230222 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e5582d-dd71-4ccd-84ea-bc133dce917c" containerName="swift-ring-rebalance"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.230281 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e5582d-dd71-4ccd-84ea-bc133dce917c" containerName="swift-ring-rebalance"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.230858 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e5582d-dd71-4ccd-84ea-bc133dce917c" containerName="swift-ring-rebalance"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.239138 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-66fxr"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.249028 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-66fxr"]
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.310663 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-62c6-account-create-update-z9fv5"]
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.312246 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-62c6-account-create-update-z9fv5"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.313652 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hpq5q-config-kv49j"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.316852 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.333616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-scripts\") pod \"0afd1e99-88b4-481e-9af4-8a2eac081dea\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") "
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.333706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6pxt\" (UniqueName: \"kubernetes.io/projected/0afd1e99-88b4-481e-9af4-8a2eac081dea-kube-api-access-n6pxt\") pod \"0afd1e99-88b4-481e-9af4-8a2eac081dea\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") "
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.333835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-additional-scripts\") pod \"0afd1e99-88b4-481e-9af4-8a2eac081dea\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") "
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.333951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run-ovn\") pod \"0afd1e99-88b4-481e-9af4-8a2eac081dea\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") "
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.334064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-log-ovn\") pod \"0afd1e99-88b4-481e-9af4-8a2eac081dea\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") "
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.334203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run\") pod \"0afd1e99-88b4-481e-9af4-8a2eac081dea\" (UID: \"0afd1e99-88b4-481e-9af4-8a2eac081dea\") " Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.335144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0afd1e99-88b4-481e-9af4-8a2eac081dea" (UID: "0afd1e99-88b4-481e-9af4-8a2eac081dea"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.335215 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0afd1e99-88b4-481e-9af4-8a2eac081dea" (UID: "0afd1e99-88b4-481e-9af4-8a2eac081dea"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.335652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0afd1e99-88b4-481e-9af4-8a2eac081dea" (UID: "0afd1e99-88b4-481e-9af4-8a2eac081dea"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.336007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-scripts" (OuterVolumeSpecName: "scripts") pod "0afd1e99-88b4-481e-9af4-8a2eac081dea" (UID: "0afd1e99-88b4-481e-9af4-8a2eac081dea"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.339470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run" (OuterVolumeSpecName: "var-run") pod "0afd1e99-88b4-481e-9af4-8a2eac081dea" (UID: "0afd1e99-88b4-481e-9af4-8a2eac081dea"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.341381 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-6wzp8"] Jan 29 03:45:36 crc kubenswrapper[4707]: E0129 03:45:36.341974 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afd1e99-88b4-481e-9af4-8a2eac081dea" containerName="ovn-config" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.341997 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afd1e99-88b4-481e-9af4-8a2eac081dea" containerName="ovn-config" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.342234 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afd1e99-88b4-481e-9af4-8a2eac081dea" containerName="ovn-config" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.342988 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-6wzp8" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.343092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvz5r\" (UniqueName: \"kubernetes.io/projected/f0521623-7e83-4fec-b9b1-3414ae979a0d-kube-api-access-nvz5r\") pod \"cinder-db-create-66fxr\" (UID: \"f0521623-7e83-4fec-b9b1-3414ae979a0d\") " pod="openstack/cinder-db-create-66fxr" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.343211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0521623-7e83-4fec-b9b1-3414ae979a0d-operator-scripts\") pod \"cinder-db-create-66fxr\" (UID: \"f0521623-7e83-4fec-b9b1-3414ae979a0d\") " pod="openstack/cinder-db-create-66fxr" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.343329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e10248-9205-4bb2-be05-60aa2647f447-operator-scripts\") pod \"heat-62c6-account-create-update-z9fv5\" (UID: \"01e10248-9205-4bb2-be05-60aa2647f447\") " pod="openstack/heat-62c6-account-create-update-z9fv5" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.347794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afd1e99-88b4-481e-9af4-8a2eac081dea-kube-api-access-n6pxt" (OuterVolumeSpecName: "kube-api-access-n6pxt") pod "0afd1e99-88b4-481e-9af4-8a2eac081dea" (UID: "0afd1e99-88b4-481e-9af4-8a2eac081dea"). InnerVolumeSpecName "kube-api-access-n6pxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.343530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675tk\" (UniqueName: \"kubernetes.io/projected/01e10248-9205-4bb2-be05-60aa2647f447-kube-api-access-675tk\") pod \"heat-62c6-account-create-update-z9fv5\" (UID: \"01e10248-9205-4bb2-be05-60aa2647f447\") " pod="openstack/heat-62c6-account-create-update-z9fv5" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.359769 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.359802 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6pxt\" (UniqueName: \"kubernetes.io/projected/0afd1e99-88b4-481e-9af4-8a2eac081dea-kube-api-access-n6pxt\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.359817 4707 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0afd1e99-88b4-481e-9af4-8a2eac081dea-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.359837 4707 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.359852 4707 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.359864 4707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/0afd1e99-88b4-481e-9af4-8a2eac081dea-var-run\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.400144 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-62c6-account-create-update-z9fv5"] Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.437033 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-6wzp8"] Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.440712 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-v9z9m" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.449161 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c73b-account-create-update-fvw6z"] Jan 29 03:45:36 crc kubenswrapper[4707]: E0129 03:45:36.449674 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc2eede-643f-4839-a293-5211e57c59ed" containerName="mariadb-account-create-update" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.449712 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc2eede-643f-4839-a293-5211e57c59ed" containerName="mariadb-account-create-update" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.449893 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc2eede-643f-4839-a293-5211e57c59ed" containerName="mariadb-account-create-update" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.450799 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c73b-account-create-update-fvw6z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.457420 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.461606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62rmn\" (UniqueName: \"kubernetes.io/projected/f4324488-aeee-4bcb-b62b-8b238db04a68-kube-api-access-62rmn\") pod \"heat-db-create-6wzp8\" (UID: \"f4324488-aeee-4bcb-b62b-8b238db04a68\") " pod="openstack/heat-db-create-6wzp8" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.461670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvz5r\" (UniqueName: \"kubernetes.io/projected/f0521623-7e83-4fec-b9b1-3414ae979a0d-kube-api-access-nvz5r\") pod \"cinder-db-create-66fxr\" (UID: \"f0521623-7e83-4fec-b9b1-3414ae979a0d\") " pod="openstack/cinder-db-create-66fxr" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.461820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0521623-7e83-4fec-b9b1-3414ae979a0d-operator-scripts\") pod \"cinder-db-create-66fxr\" (UID: \"f0521623-7e83-4fec-b9b1-3414ae979a0d\") " pod="openstack/cinder-db-create-66fxr" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.461959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e10248-9205-4bb2-be05-60aa2647f447-operator-scripts\") pod \"heat-62c6-account-create-update-z9fv5\" (UID: \"01e10248-9205-4bb2-be05-60aa2647f447\") " pod="openstack/heat-62c6-account-create-update-z9fv5" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.462171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4324488-aeee-4bcb-b62b-8b238db04a68-operator-scripts\") pod \"heat-db-create-6wzp8\" (UID: \"f4324488-aeee-4bcb-b62b-8b238db04a68\") " pod="openstack/heat-db-create-6wzp8" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.462251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675tk\" (UniqueName: \"kubernetes.io/projected/01e10248-9205-4bb2-be05-60aa2647f447-kube-api-access-675tk\") pod \"heat-62c6-account-create-update-z9fv5\" (UID: \"01e10248-9205-4bb2-be05-60aa2647f447\") " pod="openstack/heat-62c6-account-create-update-z9fv5" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.463008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0521623-7e83-4fec-b9b1-3414ae979a0d-operator-scripts\") pod \"cinder-db-create-66fxr\" (UID: \"f0521623-7e83-4fec-b9b1-3414ae979a0d\") " pod="openstack/cinder-db-create-66fxr" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.463779 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e10248-9205-4bb2-be05-60aa2647f447-operator-scripts\") pod \"heat-62c6-account-create-update-z9fv5\" (UID: \"01e10248-9205-4bb2-be05-60aa2647f447\") " pod="openstack/heat-62c6-account-create-update-z9fv5" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.475254 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-swv4z"] Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.476630 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-swv4z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.484720 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c73b-account-create-update-fvw6z"] Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.492708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-675tk\" (UniqueName: \"kubernetes.io/projected/01e10248-9205-4bb2-be05-60aa2647f447-kube-api-access-675tk\") pod \"heat-62c6-account-create-update-z9fv5\" (UID: \"01e10248-9205-4bb2-be05-60aa2647f447\") " pod="openstack/heat-62c6-account-create-update-z9fv5" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.495652 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-swv4z"] Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.499732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvz5r\" (UniqueName: \"kubernetes.io/projected/f0521623-7e83-4fec-b9b1-3414ae979a0d-kube-api-access-nvz5r\") pod \"cinder-db-create-66fxr\" (UID: \"f0521623-7e83-4fec-b9b1-3414ae979a0d\") " pod="openstack/cinder-db-create-66fxr" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.557420 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e746-account-create-update-s9k4v"] Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.559090 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e746-account-create-update-s9k4v" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.561116 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.565849 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e746-account-create-update-s9k4v"] Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.566895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l88ft\" (UniqueName: \"kubernetes.io/projected/1cc2eede-643f-4839-a293-5211e57c59ed-kube-api-access-l88ft\") pod \"1cc2eede-643f-4839-a293-5211e57c59ed\" (UID: \"1cc2eede-643f-4839-a293-5211e57c59ed\") " Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.567201 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc2eede-643f-4839-a293-5211e57c59ed-operator-scripts\") pod \"1cc2eede-643f-4839-a293-5211e57c59ed\" (UID: \"1cc2eede-643f-4839-a293-5211e57c59ed\") " Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.567503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htqsw\" (UniqueName: \"kubernetes.io/projected/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-kube-api-access-htqsw\") pod \"barbican-db-create-swv4z\" (UID: \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\") " pod="openstack/barbican-db-create-swv4z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.567565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-operator-scripts\") pod \"barbican-db-create-swv4z\" (UID: \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\") " pod="openstack/barbican-db-create-swv4z" Jan 29 03:45:36 crc 
kubenswrapper[4707]: I0129 03:45:36.568654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4324488-aeee-4bcb-b62b-8b238db04a68-operator-scripts\") pod \"heat-db-create-6wzp8\" (UID: \"f4324488-aeee-4bcb-b62b-8b238db04a68\") " pod="openstack/heat-db-create-6wzp8" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.568686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a74e51-b46b-4d96-ba78-5073504fb9c5-operator-scripts\") pod \"barbican-c73b-account-create-update-fvw6z\" (UID: \"32a74e51-b46b-4d96-ba78-5073504fb9c5\") " pod="openstack/barbican-c73b-account-create-update-fvw6z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.568764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62rmn\" (UniqueName: \"kubernetes.io/projected/f4324488-aeee-4bcb-b62b-8b238db04a68-kube-api-access-62rmn\") pod \"heat-db-create-6wzp8\" (UID: \"f4324488-aeee-4bcb-b62b-8b238db04a68\") " pod="openstack/heat-db-create-6wzp8" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.568911 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc2eede-643f-4839-a293-5211e57c59ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cc2eede-643f-4839-a293-5211e57c59ed" (UID: "1cc2eede-643f-4839-a293-5211e57c59ed"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.569582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4324488-aeee-4bcb-b62b-8b238db04a68-operator-scripts\") pod \"heat-db-create-6wzp8\" (UID: \"f4324488-aeee-4bcb-b62b-8b238db04a68\") " pod="openstack/heat-db-create-6wzp8" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.569660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdmc\" (UniqueName: \"kubernetes.io/projected/32a74e51-b46b-4d96-ba78-5073504fb9c5-kube-api-access-xwdmc\") pod \"barbican-c73b-account-create-update-fvw6z\" (UID: \"32a74e51-b46b-4d96-ba78-5073504fb9c5\") " pod="openstack/barbican-c73b-account-create-update-fvw6z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.569875 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc2eede-643f-4839-a293-5211e57c59ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.574588 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wwvj2"] Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.584804 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc2eede-643f-4839-a293-5211e57c59ed-kube-api-access-l88ft" (OuterVolumeSpecName: "kube-api-access-l88ft") pod "1cc2eede-643f-4839-a293-5211e57c59ed" (UID: "1cc2eede-643f-4839-a293-5211e57c59ed"). InnerVolumeSpecName "kube-api-access-l88ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.585643 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wwvj2" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.587023 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wwvj2"] Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.595135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.595960 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.596221 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.596249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdlvf" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.601043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62rmn\" (UniqueName: \"kubernetes.io/projected/f4324488-aeee-4bcb-b62b-8b238db04a68-kube-api-access-62rmn\") pod \"heat-db-create-6wzp8\" (UID: \"f4324488-aeee-4bcb-b62b-8b238db04a68\") " pod="openstack/heat-db-create-6wzp8" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.657305 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hpq5q-config-kv49j" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.657333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q-config-kv49j" event={"ID":"0afd1e99-88b4-481e-9af4-8a2eac081dea","Type":"ContainerDied","Data":"e919d3e302e670e9f673d412bd23cfda5b2ed903ef9dd28789f23b63e824d690"} Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.657397 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e919d3e302e670e9f673d412bd23cfda5b2ed903ef9dd28789f23b63e824d690" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.661197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v9z9m" event={"ID":"1cc2eede-643f-4839-a293-5211e57c59ed","Type":"ContainerDied","Data":"5ef4ee646dcb8a28f03bc0a5068ead5bb4540e2ff3875063a26ee2312e11228f"} Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.661216 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-v9z9m" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.661226 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ef4ee646dcb8a28f03bc0a5068ead5bb4540e2ff3875063a26ee2312e11228f" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.672864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a74e51-b46b-4d96-ba78-5073504fb9c5-operator-scripts\") pod \"barbican-c73b-account-create-update-fvw6z\" (UID: \"32a74e51-b46b-4d96-ba78-5073504fb9c5\") " pod="openstack/barbican-c73b-account-create-update-fvw6z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.672958 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db7e1d7-27c8-4f26-9288-fb94302ba13b-operator-scripts\") pod \"cinder-e746-account-create-update-s9k4v\" (UID: \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\") " pod="openstack/cinder-e746-account-create-update-s9k4v" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.673011 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-combined-ca-bundle\") pod \"keystone-db-sync-wwvj2\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") " pod="openstack/keystone-db-sync-wwvj2" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.673035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwdmc\" (UniqueName: \"kubernetes.io/projected/32a74e51-b46b-4d96-ba78-5073504fb9c5-kube-api-access-xwdmc\") pod \"barbican-c73b-account-create-update-fvw6z\" (UID: \"32a74e51-b46b-4d96-ba78-5073504fb9c5\") " pod="openstack/barbican-c73b-account-create-update-fvw6z" Jan 29 03:45:36 crc 
kubenswrapper[4707]: I0129 03:45:36.673070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htqsw\" (UniqueName: \"kubernetes.io/projected/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-kube-api-access-htqsw\") pod \"barbican-db-create-swv4z\" (UID: \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\") " pod="openstack/barbican-db-create-swv4z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.673095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5s9s\" (UniqueName: \"kubernetes.io/projected/f945e029-2a96-43ab-93aa-556eeadfda35-kube-api-access-m5s9s\") pod \"keystone-db-sync-wwvj2\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") " pod="openstack/keystone-db-sync-wwvj2" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.673115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-operator-scripts\") pod \"barbican-db-create-swv4z\" (UID: \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\") " pod="openstack/barbican-db-create-swv4z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.673147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nvst\" (UniqueName: \"kubernetes.io/projected/2db7e1d7-27c8-4f26-9288-fb94302ba13b-kube-api-access-5nvst\") pod \"cinder-e746-account-create-update-s9k4v\" (UID: \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\") " pod="openstack/cinder-e746-account-create-update-s9k4v" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.673172 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-config-data\") pod \"keystone-db-sync-wwvj2\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") " 
pod="openstack/keystone-db-sync-wwvj2" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.673237 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l88ft\" (UniqueName: \"kubernetes.io/projected/1cc2eede-643f-4839-a293-5211e57c59ed-kube-api-access-l88ft\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.673926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a74e51-b46b-4d96-ba78-5073504fb9c5-operator-scripts\") pod \"barbican-c73b-account-create-update-fvw6z\" (UID: \"32a74e51-b46b-4d96-ba78-5073504fb9c5\") " pod="openstack/barbican-c73b-account-create-update-fvw6z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.675150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-operator-scripts\") pod \"barbican-db-create-swv4z\" (UID: \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\") " pod="openstack/barbican-db-create-swv4z" Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.676140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"4ef3c9036a4221ba329c79d2aa0530dd7eb6ff6bab51f250285e3f4d0a96448e"} Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.676185 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"f5a534bb8c7a2303ff5bc5c284e45d7b74dc093b2b95d3b2143981deedcd438e"} Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.694551 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htqsw\" (UniqueName: \"kubernetes.io/projected/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-kube-api-access-htqsw\") pod 
\"barbican-db-create-swv4z\" (UID: \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\") " pod="openstack/barbican-db-create-swv4z"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.695397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwdmc\" (UniqueName: \"kubernetes.io/projected/32a74e51-b46b-4d96-ba78-5073504fb9c5-kube-api-access-xwdmc\") pod \"barbican-c73b-account-create-update-fvw6z\" (UID: \"32a74e51-b46b-4d96-ba78-5073504fb9c5\") " pod="openstack/barbican-c73b-account-create-update-fvw6z"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.718127 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-66fxr"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.733484 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-62c6-account-create-update-z9fv5"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.780800 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-6wzp8"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.781374 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-combined-ca-bundle\") pod \"keystone-db-sync-wwvj2\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") " pod="openstack/keystone-db-sync-wwvj2"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.781451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5s9s\" (UniqueName: \"kubernetes.io/projected/f945e029-2a96-43ab-93aa-556eeadfda35-kube-api-access-m5s9s\") pod \"keystone-db-sync-wwvj2\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") " pod="openstack/keystone-db-sync-wwvj2"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.781487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nvst\" (UniqueName: \"kubernetes.io/projected/2db7e1d7-27c8-4f26-9288-fb94302ba13b-kube-api-access-5nvst\") pod \"cinder-e746-account-create-update-s9k4v\" (UID: \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\") " pod="openstack/cinder-e746-account-create-update-s9k4v"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.781513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-config-data\") pod \"keystone-db-sync-wwvj2\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") " pod="openstack/keystone-db-sync-wwvj2"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.781592 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db7e1d7-27c8-4f26-9288-fb94302ba13b-operator-scripts\") pod \"cinder-e746-account-create-update-s9k4v\" (UID: \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\") " pod="openstack/cinder-e746-account-create-update-s9k4v"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.782361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db7e1d7-27c8-4f26-9288-fb94302ba13b-operator-scripts\") pod \"cinder-e746-account-create-update-s9k4v\" (UID: \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\") " pod="openstack/cinder-e746-account-create-update-s9k4v"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.791478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-combined-ca-bundle\") pod \"keystone-db-sync-wwvj2\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") " pod="openstack/keystone-db-sync-wwvj2"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.792012 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c73b-account-create-update-fvw6z"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.793436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-config-data\") pod \"keystone-db-sync-wwvj2\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") " pod="openstack/keystone-db-sync-wwvj2"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.805156 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-swv4z"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.812662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nvst\" (UniqueName: \"kubernetes.io/projected/2db7e1d7-27c8-4f26-9288-fb94302ba13b-kube-api-access-5nvst\") pod \"cinder-e746-account-create-update-s9k4v\" (UID: \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\") " pod="openstack/cinder-e746-account-create-update-s9k4v"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.819361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5s9s\" (UniqueName: \"kubernetes.io/projected/f945e029-2a96-43ab-93aa-556eeadfda35-kube-api-access-m5s9s\") pod \"keystone-db-sync-wwvj2\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") " pod="openstack/keystone-db-sync-wwvj2"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.825337 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-trkf9"]
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.826981 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trkf9"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.842642 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d1f5-account-create-update-7p7sh"]
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.844054 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d1f5-account-create-update-7p7sh"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.845693 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.852399 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-trkf9"]
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.863002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d1f5-account-create-update-7p7sh"]
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.883173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8mk5\" (UniqueName: \"kubernetes.io/projected/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-kube-api-access-b8mk5\") pod \"neutron-db-create-trkf9\" (UID: \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\") " pod="openstack/neutron-db-create-trkf9"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.883262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-operator-scripts\") pod \"neutron-d1f5-account-create-update-7p7sh\" (UID: \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\") " pod="openstack/neutron-d1f5-account-create-update-7p7sh"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.883303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-operator-scripts\") pod \"neutron-db-create-trkf9\" (UID: \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\") " pod="openstack/neutron-db-create-trkf9"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.883331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2rlz\" (UniqueName: \"kubernetes.io/projected/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-kube-api-access-d2rlz\") pod \"neutron-d1f5-account-create-update-7p7sh\" (UID: \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\") " pod="openstack/neutron-d1f5-account-create-update-7p7sh"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.897027 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e746-account-create-update-s9k4v"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.917554 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wwvj2"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.934634 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-v9z9m"]
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.938085 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-v9z9m"]
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.985618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-operator-scripts\") pod \"neutron-d1f5-account-create-update-7p7sh\" (UID: \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\") " pod="openstack/neutron-d1f5-account-create-update-7p7sh"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.985686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-operator-scripts\") pod \"neutron-db-create-trkf9\" (UID: \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\") " pod="openstack/neutron-db-create-trkf9"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.985741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2rlz\" (UniqueName: \"kubernetes.io/projected/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-kube-api-access-d2rlz\") pod \"neutron-d1f5-account-create-update-7p7sh\" (UID: \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\") " pod="openstack/neutron-d1f5-account-create-update-7p7sh"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.985823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8mk5\" (UniqueName: \"kubernetes.io/projected/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-kube-api-access-b8mk5\") pod \"neutron-db-create-trkf9\" (UID: \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\") " pod="openstack/neutron-db-create-trkf9"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.987979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-operator-scripts\") pod \"neutron-db-create-trkf9\" (UID: \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\") " pod="openstack/neutron-db-create-trkf9"
Jan 29 03:45:36 crc kubenswrapper[4707]: I0129 03:45:36.988220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-operator-scripts\") pod \"neutron-d1f5-account-create-update-7p7sh\" (UID: \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\") " pod="openstack/neutron-d1f5-account-create-update-7p7sh"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.006562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8mk5\" (UniqueName: \"kubernetes.io/projected/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-kube-api-access-b8mk5\") pod \"neutron-db-create-trkf9\" (UID: \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\") " pod="openstack/neutron-db-create-trkf9"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.011461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2rlz\" (UniqueName: \"kubernetes.io/projected/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-kube-api-access-d2rlz\") pod \"neutron-d1f5-account-create-update-7p7sh\" (UID: \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\") " pod="openstack/neutron-d1f5-account-create-update-7p7sh"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.201700 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trkf9"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.208766 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d1f5-account-create-update-7p7sh"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.334995 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc2eede-643f-4839-a293-5211e57c59ed" path="/var/lib/kubelet/pods/1cc2eede-643f-4839-a293-5211e57c59ed/volumes"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.495190 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-62c6-account-create-update-z9fv5"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.528901 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hpq5q-config-kv49j"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.540961 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hpq5q-config-kv49j"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.642331 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hpq5q-config-zws9h"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.643671 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.655912 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.681982 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hpq5q-config-zws9h"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.704561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-log-ovn\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.704622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7p5s\" (UniqueName: \"kubernetes.io/projected/04448c3b-f457-4bbb-977f-be67a9a4ba75-kube-api-access-k7p5s\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.704702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-scripts\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.704728 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run-ovn\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.704752 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-additional-scripts\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.704787 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.761623 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c73b-account-create-update-fvw6z"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.785347 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e746-account-create-update-s9k4v"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.799164 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-66fxr"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.807957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-log-ovn\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.808497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7p5s\" (UniqueName: \"kubernetes.io/projected/04448c3b-f457-4bbb-977f-be67a9a4ba75-kube-api-access-k7p5s\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.808701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-scripts\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.808732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run-ovn\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.808777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-additional-scripts\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.808898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.809162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.808405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-log-ovn\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.810868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run-ovn\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.812104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-additional-scripts\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.814637 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-scripts\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.831236 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-swv4z"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.838377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7p5s\" (UniqueName: \"kubernetes.io/projected/04448c3b-f457-4bbb-977f-be67a9a4ba75-kube-api-access-k7p5s\") pod \"ovn-controller-hpq5q-config-zws9h\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:37 crc kubenswrapper[4707]: W0129 03:45:37.871657 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a74e51_b46b_4d96_ba78_5073504fb9c5.slice/crio-6cfa8ff92b9fc5c5329342e9bc946cab9b9a2b177fe10b5e8c7e606ea7c0551b WatchSource:0}: Error finding container 6cfa8ff92b9fc5c5329342e9bc946cab9b9a2b177fe10b5e8c7e606ea7c0551b: Status 404 returned error can't find the container with id 6cfa8ff92b9fc5c5329342e9bc946cab9b9a2b177fe10b5e8c7e606ea7c0551b
Jan 29 03:45:37 crc kubenswrapper[4707]: W0129 03:45:37.886660 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01e10248_9205_4bb2_be05_60aa2647f447.slice/crio-c929467c7f214aa422233ceec9b9b07552e6f8ace4c8a044790825bd6bbf6e2e WatchSource:0}: Error finding container c929467c7f214aa422233ceec9b9b07552e6f8ace4c8a044790825bd6bbf6e2e: Status 404 returned error can't find the container with id c929467c7f214aa422233ceec9b9b07552e6f8ace4c8a044790825bd6bbf6e2e
Jan 29 03:45:37 crc kubenswrapper[4707]: W0129 03:45:37.888649 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2db7e1d7_27c8_4f26_9288_fb94302ba13b.slice/crio-e7c4ea82df4d28d5d025cc1559277da10b9b27afa775a09993b894f7391b58f2 WatchSource:0}: Error finding container e7c4ea82df4d28d5d025cc1559277da10b9b27afa775a09993b894f7391b58f2: Status 404 returned error can't find the container with id e7c4ea82df4d28d5d025cc1559277da10b9b27afa775a09993b894f7391b58f2
Jan 29 03:45:37 crc kubenswrapper[4707]: W0129 03:45:37.890262 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0046b5a1_ddfb_44d6_9a24_301c0cf61b75.slice/crio-69417db1b3251e7edc5520a64a6b8b692eb16bf0ce52704dd29d03c5880bf9b4 WatchSource:0}: Error finding container 69417db1b3251e7edc5520a64a6b8b692eb16bf0ce52704dd29d03c5880bf9b4: Status 404 returned error can't find the container with id 69417db1b3251e7edc5520a64a6b8b692eb16bf0ce52704dd29d03c5880bf9b4
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.972237 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-6wzp8"]
Jan 29 03:45:37 crc kubenswrapper[4707]: I0129 03:45:37.986340 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hpq5q-config-zws9h"
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.005292 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wwvj2"]
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.611409 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-trkf9"]
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.681666 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d1f5-account-create-update-7p7sh"]
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.710916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e746-account-create-update-s9k4v" event={"ID":"2db7e1d7-27c8-4f26-9288-fb94302ba13b","Type":"ContainerStarted","Data":"e7c4ea82df4d28d5d025cc1559277da10b9b27afa775a09993b894f7391b58f2"}
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.712817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-swv4z" event={"ID":"0046b5a1-ddfb-44d6-9a24-301c0cf61b75","Type":"ContainerStarted","Data":"98ae48325409a0d42880eb8cb061055c35df5a5c804948408ca6406a72c8e2cc"}
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.712860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-swv4z" event={"ID":"0046b5a1-ddfb-44d6-9a24-301c0cf61b75","Type":"ContainerStarted","Data":"69417db1b3251e7edc5520a64a6b8b692eb16bf0ce52704dd29d03c5880bf9b4"}
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.723407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wwvj2" event={"ID":"f945e029-2a96-43ab-93aa-556eeadfda35","Type":"ContainerStarted","Data":"fd9095dd1b23c85a4da1bab34593bc8595534d1ebadaa74c7f9efb251d1ea1f7"}
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.744131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-6wzp8" event={"ID":"f4324488-aeee-4bcb-b62b-8b238db04a68","Type":"ContainerStarted","Data":"f5ecae564421294ace8cba188cb4855056be697649c2977c6a9546ea49fb44ec"}
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.751502 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-swv4z" podStartSLOduration=2.751481036 podStartE2EDuration="2.751481036s" podCreationTimestamp="2026-01-29 03:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:38.744046637 +0000 UTC m=+1092.228275542" watchObservedRunningTime="2026-01-29 03:45:38.751481036 +0000 UTC m=+1092.235709941"
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.768493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-66fxr" event={"ID":"f0521623-7e83-4fec-b9b1-3414ae979a0d","Type":"ContainerStarted","Data":"ef158d4a57db53f20fb9ca8511e7e57575fc87180dad99e905fbb24a24e2340f"}
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.768572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-66fxr" event={"ID":"f0521623-7e83-4fec-b9b1-3414ae979a0d","Type":"ContainerStarted","Data":"d22cb8a612a9e1619fe7ec7d3548aa2a573734e832d674df6dc73823fdfd4c8c"}
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.777382 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c73b-account-create-update-fvw6z" event={"ID":"32a74e51-b46b-4d96-ba78-5073504fb9c5","Type":"ContainerStarted","Data":"6cfa8ff92b9fc5c5329342e9bc946cab9b9a2b177fe10b5e8c7e606ea7c0551b"}
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.778716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-62c6-account-create-update-z9fv5" event={"ID":"01e10248-9205-4bb2-be05-60aa2647f447","Type":"ContainerStarted","Data":"c929467c7f214aa422233ceec9b9b07552e6f8ace4c8a044790825bd6bbf6e2e"}
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.817711 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-66fxr" podStartSLOduration=2.817681506 podStartE2EDuration="2.817681506s" podCreationTimestamp="2026-01-29 03:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:38.807278484 +0000 UTC m=+1092.291507389" watchObservedRunningTime="2026-01-29 03:45:38.817681506 +0000 UTC m=+1092.301910431"
Jan 29 03:45:38 crc kubenswrapper[4707]: W0129 03:45:38.838438 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d0c5c1a_19a4_475f_a810_71feb6ff1d5f.slice/crio-38fdd46200ab652e2af2335b2cd169e4d2c3b1975a4924a672acc3f839eef683 WatchSource:0}: Error finding container 38fdd46200ab652e2af2335b2cd169e4d2c3b1975a4924a672acc3f839eef683: Status 404 returned error can't find the container with id 38fdd46200ab652e2af2335b2cd169e4d2c3b1975a4924a672acc3f839eef683
Jan 29 03:45:38 crc kubenswrapper[4707]: W0129 03:45:38.838865 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf4b89d8_90c5_4ac0_a4e0_4c4d8bffbcc5.slice/crio-7a30f5d2a97533f3cc1c8a96e216b2c8cb4b61a2e3fed7810dccd78ae4b84672 WatchSource:0}: Error finding container 7a30f5d2a97533f3cc1c8a96e216b2c8cb4b61a2e3fed7810dccd78ae4b84672: Status 404 returned error can't find the container with id 7a30f5d2a97533f3cc1c8a96e216b2c8cb4b61a2e3fed7810dccd78ae4b84672
Jan 29 03:45:38 crc kubenswrapper[4707]: I0129 03:45:38.939556 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hpq5q-config-zws9h"]
Jan 29 03:45:38 crc kubenswrapper[4707]: W0129 03:45:38.972130 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04448c3b_f457_4bbb_977f_be67a9a4ba75.slice/crio-1ee3bc7f9390c02d9c32106f52a35b0876072efe0a706ec71334aad9564b935a WatchSource:0}: Error finding container 1ee3bc7f9390c02d9c32106f52a35b0876072efe0a706ec71334aad9564b935a: Status 404 returned error can't find the container with id 1ee3bc7f9390c02d9c32106f52a35b0876072efe0a706ec71334aad9564b935a
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.082901 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hpq5q"
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.281144 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afd1e99-88b4-481e-9af4-8a2eac081dea" path="/var/lib/kubelet/pods/0afd1e99-88b4-481e-9af4-8a2eac081dea/volumes"
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.789919 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0521623-7e83-4fec-b9b1-3414ae979a0d" containerID="ef158d4a57db53f20fb9ca8511e7e57575fc87180dad99e905fbb24a24e2340f" exitCode=0
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.789990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-66fxr" event={"ID":"f0521623-7e83-4fec-b9b1-3414ae979a0d","Type":"ContainerDied","Data":"ef158d4a57db53f20fb9ca8511e7e57575fc87180dad99e905fbb24a24e2340f"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.791623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d1f5-account-create-update-7p7sh" event={"ID":"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5","Type":"ContainerStarted","Data":"3d2fee166692052d0ce695e9235b93bbcd303489695f1256d8032871a784429c"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.791682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d1f5-account-create-update-7p7sh" event={"ID":"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5","Type":"ContainerStarted","Data":"7a30f5d2a97533f3cc1c8a96e216b2c8cb4b61a2e3fed7810dccd78ae4b84672"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.793182 4707 generic.go:334] "Generic (PLEG): container finished" podID="2db7e1d7-27c8-4f26-9288-fb94302ba13b" containerID="55e11662761f02f30e62ca9478f8b30cbc625cf1d1af2799412407c9f3e2c87d" exitCode=0
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.793311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e746-account-create-update-s9k4v" event={"ID":"2db7e1d7-27c8-4f26-9288-fb94302ba13b","Type":"ContainerDied","Data":"55e11662761f02f30e62ca9478f8b30cbc625cf1d1af2799412407c9f3e2c87d"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.795314 4707 generic.go:334] "Generic (PLEG): container finished" podID="0046b5a1-ddfb-44d6-9a24-301c0cf61b75" containerID="98ae48325409a0d42880eb8cb061055c35df5a5c804948408ca6406a72c8e2cc" exitCode=0
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.795375 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-swv4z" event={"ID":"0046b5a1-ddfb-44d6-9a24-301c0cf61b75","Type":"ContainerDied","Data":"98ae48325409a0d42880eb8cb061055c35df5a5c804948408ca6406a72c8e2cc"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.802488 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d0c5c1a-19a4-475f-a810-71feb6ff1d5f" containerID="6aeba6acbc0e96892797cb8e103a6dabb18bf651bac1a97c4c4a80df45932d74" exitCode=0
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.802566 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-trkf9" event={"ID":"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f","Type":"ContainerDied","Data":"6aeba6acbc0e96892797cb8e103a6dabb18bf651bac1a97c4c4a80df45932d74"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.802643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-trkf9" event={"ID":"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f","Type":"ContainerStarted","Data":"38fdd46200ab652e2af2335b2cd169e4d2c3b1975a4924a672acc3f839eef683"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.810896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"81d6d372fec692768cc6088b0b9eba972618726be0943c9307f7c0563fb134fb"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.810952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"4059e98fdb604c99e86582fef899f76b65fdae7ad80702072600b69cd76aeb1b"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.816009 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4324488-aeee-4bcb-b62b-8b238db04a68" containerID="e863074f4f2d6fa39c0994f88ffdaee2b48fbba4e27705032f9821c96cf8d356" exitCode=0
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.816080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-6wzp8" event={"ID":"f4324488-aeee-4bcb-b62b-8b238db04a68","Type":"ContainerDied","Data":"e863074f4f2d6fa39c0994f88ffdaee2b48fbba4e27705032f9821c96cf8d356"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.825169 4707 generic.go:334] "Generic (PLEG): container finished" podID="32a74e51-b46b-4d96-ba78-5073504fb9c5" containerID="a8160b81dabed31cb475030f365e6ea1de1347a88ad2958c908250de93b3e35c" exitCode=0
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.825232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c73b-account-create-update-fvw6z" event={"ID":"32a74e51-b46b-4d96-ba78-5073504fb9c5","Type":"ContainerDied","Data":"a8160b81dabed31cb475030f365e6ea1de1347a88ad2958c908250de93b3e35c"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.829335 4707 generic.go:334] "Generic (PLEG): container finished" podID="01e10248-9205-4bb2-be05-60aa2647f447" containerID="9c5d58d279b3629677204620cb901b5983d157b881a20e83d92529f049ad3b78" exitCode=0
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.829417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-62c6-account-create-update-z9fv5" event={"ID":"01e10248-9205-4bb2-be05-60aa2647f447","Type":"ContainerDied","Data":"9c5d58d279b3629677204620cb901b5983d157b881a20e83d92529f049ad3b78"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.833917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q-config-zws9h" event={"ID":"04448c3b-f457-4bbb-977f-be67a9a4ba75","Type":"ContainerStarted","Data":"7485bbd2f333b34fd80b590f767a1eab7984596b3169b18b67d25cead81e02a7"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.833990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q-config-zws9h" event={"ID":"04448c3b-f457-4bbb-977f-be67a9a4ba75","Type":"ContainerStarted","Data":"1ee3bc7f9390c02d9c32106f52a35b0876072efe0a706ec71334aad9564b935a"}
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.865711 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d1f5-account-create-update-7p7sh" podStartSLOduration=3.8656840409999997 podStartE2EDuration="3.865684041s" podCreationTimestamp="2026-01-29 03:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:39.863357826 +0000 UTC m=+1093.347586731" watchObservedRunningTime="2026-01-29 03:45:39.865684041 +0000 UTC m=+1093.349912946"
Jan 29 03:45:39 crc kubenswrapper[4707]: I0129 03:45:39.935221 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hpq5q-config-zws9h" podStartSLOduration=2.935199724 podStartE2EDuration="2.935199724s" podCreationTimestamp="2026-01-29 03:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:39.929878045 +0000 UTC m=+1093.414106950" watchObservedRunningTime="2026-01-29 03:45:39.935199724 +0000 UTC m=+1093.419428629"
Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.498769 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-w56h9"]
Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.500443 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.503667 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.529045 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-w56h9"] Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.598037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q27pl\" (UniqueName: \"kubernetes.io/projected/8f60529f-12d2-41ed-8b70-6c63bdadcb55-kube-api-access-q27pl\") pod \"root-account-create-update-w56h9\" (UID: \"8f60529f-12d2-41ed-8b70-6c63bdadcb55\") " pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.598761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f60529f-12d2-41ed-8b70-6c63bdadcb55-operator-scripts\") pod \"root-account-create-update-w56h9\" (UID: \"8f60529f-12d2-41ed-8b70-6c63bdadcb55\") " pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.700810 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q27pl\" (UniqueName: \"kubernetes.io/projected/8f60529f-12d2-41ed-8b70-6c63bdadcb55-kube-api-access-q27pl\") pod \"root-account-create-update-w56h9\" (UID: \"8f60529f-12d2-41ed-8b70-6c63bdadcb55\") " pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.701656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f60529f-12d2-41ed-8b70-6c63bdadcb55-operator-scripts\") pod \"root-account-create-update-w56h9\" (UID: 
\"8f60529f-12d2-41ed-8b70-6c63bdadcb55\") " pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.702726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f60529f-12d2-41ed-8b70-6c63bdadcb55-operator-scripts\") pod \"root-account-create-update-w56h9\" (UID: \"8f60529f-12d2-41ed-8b70-6c63bdadcb55\") " pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.721937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q27pl\" (UniqueName: \"kubernetes.io/projected/8f60529f-12d2-41ed-8b70-6c63bdadcb55-kube-api-access-q27pl\") pod \"root-account-create-update-w56h9\" (UID: \"8f60529f-12d2-41ed-8b70-6c63bdadcb55\") " pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.820720 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.864318 4707 generic.go:334] "Generic (PLEG): container finished" podID="cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5" containerID="3d2fee166692052d0ce695e9235b93bbcd303489695f1256d8032871a784429c" exitCode=0 Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.864408 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d1f5-account-create-update-7p7sh" event={"ID":"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5","Type":"ContainerDied","Data":"3d2fee166692052d0ce695e9235b93bbcd303489695f1256d8032871a784429c"} Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.883254 4707 generic.go:334] "Generic (PLEG): container finished" podID="04448c3b-f457-4bbb-977f-be67a9a4ba75" containerID="7485bbd2f333b34fd80b590f767a1eab7984596b3169b18b67d25cead81e02a7" exitCode=0 Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.883341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q-config-zws9h" event={"ID":"04448c3b-f457-4bbb-977f-be67a9a4ba75","Type":"ContainerDied","Data":"7485bbd2f333b34fd80b590f767a1eab7984596b3169b18b67d25cead81e02a7"} Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.892786 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"08c54fc657dcbd64f055771c3e38e05b7b69efb6683cc814939f99db67f548ef"} Jan 29 03:45:40 crc kubenswrapper[4707]: I0129 03:45:40.892838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"6b68828f2a3724f49aea26472a598bcdd200ec50c4a66c14923ebb3ea4ba995e"} Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.482777 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e746-account-create-update-s9k4v" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.618314 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db7e1d7-27c8-4f26-9288-fb94302ba13b-operator-scripts\") pod \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\" (UID: \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.618412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nvst\" (UniqueName: \"kubernetes.io/projected/2db7e1d7-27c8-4f26-9288-fb94302ba13b-kube-api-access-5nvst\") pod \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\" (UID: \"2db7e1d7-27c8-4f26-9288-fb94302ba13b\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.619292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db7e1d7-27c8-4f26-9288-fb94302ba13b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2db7e1d7-27c8-4f26-9288-fb94302ba13b" (UID: "2db7e1d7-27c8-4f26-9288-fb94302ba13b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.626717 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db7e1d7-27c8-4f26-9288-fb94302ba13b-kube-api-access-5nvst" (OuterVolumeSpecName: "kube-api-access-5nvst") pod "2db7e1d7-27c8-4f26-9288-fb94302ba13b" (UID: "2db7e1d7-27c8-4f26-9288-fb94302ba13b"). InnerVolumeSpecName "kube-api-access-5nvst". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.718928 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-62c6-account-create-update-z9fv5" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.729871 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nvst\" (UniqueName: \"kubernetes.io/projected/2db7e1d7-27c8-4f26-9288-fb94302ba13b-kube-api-access-5nvst\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.729942 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2db7e1d7-27c8-4f26-9288-fb94302ba13b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.735470 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c73b-account-create-update-fvw6z" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.763096 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-swv4z" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.797936 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-6wzp8" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.800684 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trkf9" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.810252 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-66fxr" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.832465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-operator-scripts\") pod \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\" (UID: \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.832607 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htqsw\" (UniqueName: \"kubernetes.io/projected/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-kube-api-access-htqsw\") pod \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\" (UID: \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.832642 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8mk5\" (UniqueName: \"kubernetes.io/projected/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-kube-api-access-b8mk5\") pod \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\" (UID: \"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.832685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvz5r\" (UniqueName: \"kubernetes.io/projected/f0521623-7e83-4fec-b9b1-3414ae979a0d-kube-api-access-nvz5r\") pod \"f0521623-7e83-4fec-b9b1-3414ae979a0d\" (UID: \"f0521623-7e83-4fec-b9b1-3414ae979a0d\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.832757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62rmn\" (UniqueName: \"kubernetes.io/projected/f4324488-aeee-4bcb-b62b-8b238db04a68-kube-api-access-62rmn\") pod \"f4324488-aeee-4bcb-b62b-8b238db04a68\" (UID: \"f4324488-aeee-4bcb-b62b-8b238db04a68\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.832802 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a74e51-b46b-4d96-ba78-5073504fb9c5-operator-scripts\") pod \"32a74e51-b46b-4d96-ba78-5073504fb9c5\" (UID: \"32a74e51-b46b-4d96-ba78-5073504fb9c5\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.832896 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-675tk\" (UniqueName: \"kubernetes.io/projected/01e10248-9205-4bb2-be05-60aa2647f447-kube-api-access-675tk\") pod \"01e10248-9205-4bb2-be05-60aa2647f447\" (UID: \"01e10248-9205-4bb2-be05-60aa2647f447\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.832945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-operator-scripts\") pod \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\" (UID: \"0046b5a1-ddfb-44d6-9a24-301c0cf61b75\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.833011 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e10248-9205-4bb2-be05-60aa2647f447-operator-scripts\") pod \"01e10248-9205-4bb2-be05-60aa2647f447\" (UID: \"01e10248-9205-4bb2-be05-60aa2647f447\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.833031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0521623-7e83-4fec-b9b1-3414ae979a0d-operator-scripts\") pod \"f0521623-7e83-4fec-b9b1-3414ae979a0d\" (UID: \"f0521623-7e83-4fec-b9b1-3414ae979a0d\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.833085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4324488-aeee-4bcb-b62b-8b238db04a68-operator-scripts\") pod 
\"f4324488-aeee-4bcb-b62b-8b238db04a68\" (UID: \"f4324488-aeee-4bcb-b62b-8b238db04a68\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.833138 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwdmc\" (UniqueName: \"kubernetes.io/projected/32a74e51-b46b-4d96-ba78-5073504fb9c5-kube-api-access-xwdmc\") pod \"32a74e51-b46b-4d96-ba78-5073504fb9c5\" (UID: \"32a74e51-b46b-4d96-ba78-5073504fb9c5\") " Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.834114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01e10248-9205-4bb2-be05-60aa2647f447-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01e10248-9205-4bb2-be05-60aa2647f447" (UID: "01e10248-9205-4bb2-be05-60aa2647f447"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.834148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a74e51-b46b-4d96-ba78-5073504fb9c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32a74e51-b46b-4d96-ba78-5073504fb9c5" (UID: "32a74e51-b46b-4d96-ba78-5073504fb9c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.834692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d0c5c1a-19a4-475f-a810-71feb6ff1d5f" (UID: "9d0c5c1a-19a4-475f-a810-71feb6ff1d5f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.838109 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e10248-9205-4bb2-be05-60aa2647f447-kube-api-access-675tk" (OuterVolumeSpecName: "kube-api-access-675tk") pod "01e10248-9205-4bb2-be05-60aa2647f447" (UID: "01e10248-9205-4bb2-be05-60aa2647f447"). InnerVolumeSpecName "kube-api-access-675tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.838439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0046b5a1-ddfb-44d6-9a24-301c0cf61b75" (UID: "0046b5a1-ddfb-44d6-9a24-301c0cf61b75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.839611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a74e51-b46b-4d96-ba78-5073504fb9c5-kube-api-access-xwdmc" (OuterVolumeSpecName: "kube-api-access-xwdmc") pod "32a74e51-b46b-4d96-ba78-5073504fb9c5" (UID: "32a74e51-b46b-4d96-ba78-5073504fb9c5"). InnerVolumeSpecName "kube-api-access-xwdmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.840315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0521623-7e83-4fec-b9b1-3414ae979a0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0521623-7e83-4fec-b9b1-3414ae979a0d" (UID: "f0521623-7e83-4fec-b9b1-3414ae979a0d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.840858 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4324488-aeee-4bcb-b62b-8b238db04a68-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4324488-aeee-4bcb-b62b-8b238db04a68" (UID: "f4324488-aeee-4bcb-b62b-8b238db04a68"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.840927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0521623-7e83-4fec-b9b1-3414ae979a0d-kube-api-access-nvz5r" (OuterVolumeSpecName: "kube-api-access-nvz5r") pod "f0521623-7e83-4fec-b9b1-3414ae979a0d" (UID: "f0521623-7e83-4fec-b9b1-3414ae979a0d"). InnerVolumeSpecName "kube-api-access-nvz5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.841243 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-kube-api-access-b8mk5" (OuterVolumeSpecName: "kube-api-access-b8mk5") pod "9d0c5c1a-19a4-475f-a810-71feb6ff1d5f" (UID: "9d0c5c1a-19a4-475f-a810-71feb6ff1d5f"). InnerVolumeSpecName "kube-api-access-b8mk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.844074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4324488-aeee-4bcb-b62b-8b238db04a68-kube-api-access-62rmn" (OuterVolumeSpecName: "kube-api-access-62rmn") pod "f4324488-aeee-4bcb-b62b-8b238db04a68" (UID: "f4324488-aeee-4bcb-b62b-8b238db04a68"). InnerVolumeSpecName "kube-api-access-62rmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.845064 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-kube-api-access-htqsw" (OuterVolumeSpecName: "kube-api-access-htqsw") pod "0046b5a1-ddfb-44d6-9a24-301c0cf61b75" (UID: "0046b5a1-ddfb-44d6-9a24-301c0cf61b75"). InnerVolumeSpecName "kube-api-access-htqsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.892718 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-w56h9"] Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.906568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-66fxr" event={"ID":"f0521623-7e83-4fec-b9b1-3414ae979a0d","Type":"ContainerDied","Data":"d22cb8a612a9e1619fe7ec7d3548aa2a573734e832d674df6dc73823fdfd4c8c"} Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.906915 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d22cb8a612a9e1619fe7ec7d3548aa2a573734e832d674df6dc73823fdfd4c8c" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.906984 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-66fxr" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.916019 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c73b-account-create-update-fvw6z" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.916013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c73b-account-create-update-fvw6z" event={"ID":"32a74e51-b46b-4d96-ba78-5073504fb9c5","Type":"ContainerDied","Data":"6cfa8ff92b9fc5c5329342e9bc946cab9b9a2b177fe10b5e8c7e606ea7c0551b"} Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.916120 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cfa8ff92b9fc5c5329342e9bc946cab9b9a2b177fe10b5e8c7e606ea7c0551b" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.917128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-62c6-account-create-update-z9fv5" event={"ID":"01e10248-9205-4bb2-be05-60aa2647f447","Type":"ContainerDied","Data":"c929467c7f214aa422233ceec9b9b07552e6f8ace4c8a044790825bd6bbf6e2e"} Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.917144 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c929467c7f214aa422233ceec9b9b07552e6f8ace4c8a044790825bd6bbf6e2e" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.917187 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-62c6-account-create-update-z9fv5" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.921716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e746-account-create-update-s9k4v" event={"ID":"2db7e1d7-27c8-4f26-9288-fb94302ba13b","Type":"ContainerDied","Data":"e7c4ea82df4d28d5d025cc1559277da10b9b27afa775a09993b894f7391b58f2"} Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.921774 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c4ea82df4d28d5d025cc1559277da10b9b27afa775a09993b894f7391b58f2" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.921752 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e746-account-create-update-s9k4v" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.923386 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-swv4z" event={"ID":"0046b5a1-ddfb-44d6-9a24-301c0cf61b75","Type":"ContainerDied","Data":"69417db1b3251e7edc5520a64a6b8b692eb16bf0ce52704dd29d03c5880bf9b4"} Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.923450 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69417db1b3251e7edc5520a64a6b8b692eb16bf0ce52704dd29d03c5880bf9b4" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.923512 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-swv4z" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.937957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-trkf9" event={"ID":"9d0c5c1a-19a4-475f-a810-71feb6ff1d5f","Type":"ContainerDied","Data":"38fdd46200ab652e2af2335b2cd169e4d2c3b1975a4924a672acc3f839eef683"} Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.938023 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38fdd46200ab652e2af2335b2cd169e4d2c3b1975a4924a672acc3f839eef683" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.938114 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trkf9" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.938768 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4324488-aeee-4bcb-b62b-8b238db04a68-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.938937 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwdmc\" (UniqueName: \"kubernetes.io/projected/32a74e51-b46b-4d96-ba78-5073504fb9c5-kube-api-access-xwdmc\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.938970 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.938984 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htqsw\" (UniqueName: \"kubernetes.io/projected/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-kube-api-access-htqsw\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.938994 4707 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-b8mk5\" (UniqueName: \"kubernetes.io/projected/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f-kube-api-access-b8mk5\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.939004 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvz5r\" (UniqueName: \"kubernetes.io/projected/f0521623-7e83-4fec-b9b1-3414ae979a0d-kube-api-access-nvz5r\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.939015 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62rmn\" (UniqueName: \"kubernetes.io/projected/f4324488-aeee-4bcb-b62b-8b238db04a68-kube-api-access-62rmn\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.939025 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a74e51-b46b-4d96-ba78-5073504fb9c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.939036 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-675tk\" (UniqueName: \"kubernetes.io/projected/01e10248-9205-4bb2-be05-60aa2647f447-kube-api-access-675tk\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.939046 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0046b5a1-ddfb-44d6-9a24-301c0cf61b75-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.939057 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e10248-9205-4bb2-be05-60aa2647f447-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.939068 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f0521623-7e83-4fec-b9b1-3414ae979a0d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.958365 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-6wzp8" Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.961996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-6wzp8" event={"ID":"f4324488-aeee-4bcb-b62b-8b238db04a68","Type":"ContainerDied","Data":"f5ecae564421294ace8cba188cb4855056be697649c2977c6a9546ea49fb44ec"} Jan 29 03:45:41 crc kubenswrapper[4707]: I0129 03:45:41.962049 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ecae564421294ace8cba188cb4855056be697649c2977c6a9546ea49fb44ec" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.296455 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d1f5-account-create-update-7p7sh" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.359646 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hpq5q-config-zws9h" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.451779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-operator-scripts\") pod \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\" (UID: \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\") " Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.451911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2rlz\" (UniqueName: \"kubernetes.io/projected/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-kube-api-access-d2rlz\") pod \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\" (UID: \"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5\") " Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.454825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5" (UID: "cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.459576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-kube-api-access-d2rlz" (OuterVolumeSpecName: "kube-api-access-d2rlz") pod "cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5" (UID: "cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5"). InnerVolumeSpecName "kube-api-access-d2rlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.555621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7p5s\" (UniqueName: \"kubernetes.io/projected/04448c3b-f457-4bbb-977f-be67a9a4ba75-kube-api-access-k7p5s\") pod \"04448c3b-f457-4bbb-977f-be67a9a4ba75\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.555727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run-ovn\") pod \"04448c3b-f457-4bbb-977f-be67a9a4ba75\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.555755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-additional-scripts\") pod \"04448c3b-f457-4bbb-977f-be67a9a4ba75\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.555820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run\") pod \"04448c3b-f457-4bbb-977f-be67a9a4ba75\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.555873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-scripts\") pod \"04448c3b-f457-4bbb-977f-be67a9a4ba75\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.555866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "04448c3b-f457-4bbb-977f-be67a9a4ba75" (UID: "04448c3b-f457-4bbb-977f-be67a9a4ba75"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.555915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-log-ovn\") pod \"04448c3b-f457-4bbb-977f-be67a9a4ba75\" (UID: \"04448c3b-f457-4bbb-977f-be67a9a4ba75\") " Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.555936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run" (OuterVolumeSpecName: "var-run") pod "04448c3b-f457-4bbb-977f-be67a9a4ba75" (UID: "04448c3b-f457-4bbb-977f-be67a9a4ba75"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.556027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "04448c3b-f457-4bbb-977f-be67a9a4ba75" (UID: "04448c3b-f457-4bbb-977f-be67a9a4ba75"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.556399 4707 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.556419 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.556434 4707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-run\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.556445 4707 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04448c3b-f457-4bbb-977f-be67a9a4ba75-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.556457 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2rlz\" (UniqueName: \"kubernetes.io/projected/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5-kube-api-access-d2rlz\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.556921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "04448c3b-f457-4bbb-977f-be67a9a4ba75" (UID: "04448c3b-f457-4bbb-977f-be67a9a4ba75"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.557123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-scripts" (OuterVolumeSpecName: "scripts") pod "04448c3b-f457-4bbb-977f-be67a9a4ba75" (UID: "04448c3b-f457-4bbb-977f-be67a9a4ba75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.559826 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04448c3b-f457-4bbb-977f-be67a9a4ba75-kube-api-access-k7p5s" (OuterVolumeSpecName: "kube-api-access-k7p5s") pod "04448c3b-f457-4bbb-977f-be67a9a4ba75" (UID: "04448c3b-f457-4bbb-977f-be67a9a4ba75"). InnerVolumeSpecName "kube-api-access-k7p5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.657647 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7p5s\" (UniqueName: \"kubernetes.io/projected/04448c3b-f457-4bbb-977f-be67a9a4ba75-kube-api-access-k7p5s\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.658148 4707 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.658161 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04448c3b-f457-4bbb-977f-be67a9a4ba75-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.971438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w56h9" 
event={"ID":"8f60529f-12d2-41ed-8b70-6c63bdadcb55","Type":"ContainerStarted","Data":"0d69e262c34236f1538a9551f8682600000c85c5359856fdd0a4eb2a49ffc6e1"} Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.971522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w56h9" event={"ID":"8f60529f-12d2-41ed-8b70-6c63bdadcb55","Type":"ContainerStarted","Data":"1c7849b1aa1c6074b7be5a44d2c83c39bbf255102a8ee69da72bee785251cbca"} Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.973877 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d1f5-account-create-update-7p7sh" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.974355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d1f5-account-create-update-7p7sh" event={"ID":"cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5","Type":"ContainerDied","Data":"7a30f5d2a97533f3cc1c8a96e216b2c8cb4b61a2e3fed7810dccd78ae4b84672"} Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.974423 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a30f5d2a97533f3cc1c8a96e216b2c8cb4b61a2e3fed7810dccd78ae4b84672" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.977039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hpq5q-config-zws9h" event={"ID":"04448c3b-f457-4bbb-977f-be67a9a4ba75","Type":"ContainerDied","Data":"1ee3bc7f9390c02d9c32106f52a35b0876072efe0a706ec71334aad9564b935a"} Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.977077 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee3bc7f9390c02d9c32106f52a35b0876072efe0a706ec71334aad9564b935a" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.977172 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hpq5q-config-zws9h" Jan 29 03:45:42 crc kubenswrapper[4707]: I0129 03:45:42.994697 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-w56h9" podStartSLOduration=2.994669692 podStartE2EDuration="2.994669692s" podCreationTimestamp="2026-01-29 03:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:42.987494241 +0000 UTC m=+1096.471723146" watchObservedRunningTime="2026-01-29 03:45:42.994669692 +0000 UTC m=+1096.478898597" Jan 29 03:45:43 crc kubenswrapper[4707]: I0129 03:45:43.456047 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hpq5q-config-zws9h"] Jan 29 03:45:43 crc kubenswrapper[4707]: I0129 03:45:43.467354 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hpq5q-config-zws9h"] Jan 29 03:45:43 crc kubenswrapper[4707]: I0129 03:45:43.988229 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f60529f-12d2-41ed-8b70-6c63bdadcb55" containerID="0d69e262c34236f1538a9551f8682600000c85c5359856fdd0a4eb2a49ffc6e1" exitCode=0 Jan 29 03:45:43 crc kubenswrapper[4707]: I0129 03:45:43.988311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w56h9" event={"ID":"8f60529f-12d2-41ed-8b70-6c63bdadcb55","Type":"ContainerDied","Data":"0d69e262c34236f1538a9551f8682600000c85c5359856fdd0a4eb2a49ffc6e1"} Jan 29 03:45:45 crc kubenswrapper[4707]: I0129 03:45:45.003064 4707 generic.go:334] "Generic (PLEG): container finished" podID="d65e0ca5-1e58-4492-bd8d-92ff6d516014" containerID="00c45d457cee63fe9383969a0ffe0a8b523bcecdf1d1ad3985daaadbb025260c" exitCode=0 Jan 29 03:45:45 crc kubenswrapper[4707]: I0129 03:45:45.003222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m74xr" 
event={"ID":"d65e0ca5-1e58-4492-bd8d-92ff6d516014","Type":"ContainerDied","Data":"00c45d457cee63fe9383969a0ffe0a8b523bcecdf1d1ad3985daaadbb025260c"} Jan 29 03:45:45 crc kubenswrapper[4707]: I0129 03:45:45.259460 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04448c3b-f457-4bbb-977f-be67a9a4ba75" path="/var/lib/kubelet/pods/04448c3b-f457-4bbb-977f-be67a9a4ba75/volumes" Jan 29 03:45:45 crc kubenswrapper[4707]: I0129 03:45:45.988036 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.035271 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-w56h9" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.035810 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w56h9" event={"ID":"8f60529f-12d2-41ed-8b70-6c63bdadcb55","Type":"ContainerDied","Data":"1c7849b1aa1c6074b7be5a44d2c83c39bbf255102a8ee69da72bee785251cbca"} Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.035867 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c7849b1aa1c6074b7be5a44d2c83c39bbf255102a8ee69da72bee785251cbca" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.149363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q27pl\" (UniqueName: \"kubernetes.io/projected/8f60529f-12d2-41ed-8b70-6c63bdadcb55-kube-api-access-q27pl\") pod \"8f60529f-12d2-41ed-8b70-6c63bdadcb55\" (UID: \"8f60529f-12d2-41ed-8b70-6c63bdadcb55\") " Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.149546 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f60529f-12d2-41ed-8b70-6c63bdadcb55-operator-scripts\") pod \"8f60529f-12d2-41ed-8b70-6c63bdadcb55\" 
(UID: \"8f60529f-12d2-41ed-8b70-6c63bdadcb55\") " Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.150662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f60529f-12d2-41ed-8b70-6c63bdadcb55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f60529f-12d2-41ed-8b70-6c63bdadcb55" (UID: "8f60529f-12d2-41ed-8b70-6c63bdadcb55"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.160841 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f60529f-12d2-41ed-8b70-6c63bdadcb55-kube-api-access-q27pl" (OuterVolumeSpecName: "kube-api-access-q27pl") pod "8f60529f-12d2-41ed-8b70-6c63bdadcb55" (UID: "8f60529f-12d2-41ed-8b70-6c63bdadcb55"). InnerVolumeSpecName "kube-api-access-q27pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.251823 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q27pl\" (UniqueName: \"kubernetes.io/projected/8f60529f-12d2-41ed-8b70-6c63bdadcb55-kube-api-access-q27pl\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.251856 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f60529f-12d2-41ed-8b70-6c63bdadcb55-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.438490 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.557508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-db-sync-config-data\") pod \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.557677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-config-data\") pod \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.557715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82t66\" (UniqueName: \"kubernetes.io/projected/d65e0ca5-1e58-4492-bd8d-92ff6d516014-kube-api-access-82t66\") pod \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.557762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-combined-ca-bundle\") pod \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\" (UID: \"d65e0ca5-1e58-4492-bd8d-92ff6d516014\") " Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.562491 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d65e0ca5-1e58-4492-bd8d-92ff6d516014" (UID: "d65e0ca5-1e58-4492-bd8d-92ff6d516014"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.568836 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65e0ca5-1e58-4492-bd8d-92ff6d516014-kube-api-access-82t66" (OuterVolumeSpecName: "kube-api-access-82t66") pod "d65e0ca5-1e58-4492-bd8d-92ff6d516014" (UID: "d65e0ca5-1e58-4492-bd8d-92ff6d516014"). InnerVolumeSpecName "kube-api-access-82t66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.598390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d65e0ca5-1e58-4492-bd8d-92ff6d516014" (UID: "d65e0ca5-1e58-4492-bd8d-92ff6d516014"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.632988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-config-data" (OuterVolumeSpecName: "config-data") pod "d65e0ca5-1e58-4492-bd8d-92ff6d516014" (UID: "d65e0ca5-1e58-4492-bd8d-92ff6d516014"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.660234 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.660280 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82t66\" (UniqueName: \"kubernetes.io/projected/d65e0ca5-1e58-4492-bd8d-92ff6d516014-kube-api-access-82t66\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.660297 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:46 crc kubenswrapper[4707]: I0129 03:45:46.660308 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d65e0ca5-1e58-4492-bd8d-92ff6d516014-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.045331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wwvj2" event={"ID":"f945e029-2a96-43ab-93aa-556eeadfda35","Type":"ContainerStarted","Data":"d7bc57e63ee48165733e5e5556190fa9ad2e5fa814ab960b965446b905cdf70c"} Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.053244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"08287a6b5b7e0dee6d9b2063ed391e79bd139655fc805ed0a09732d4d9db5414"} Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.053311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"5029df5b0e68065abba88356d50b66afbfd5733da22a2f513792f160e76485a9"} Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.053327 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"a7a23fbc6c58465f1e3c384fb8c79847692cadb1bf8826cba023a013b5ef1c71"} Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.053344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"2e1629fa17935fc5ef3f65488973b575533ec3516109908280e31f2acabd0bbd"} Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.058280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m74xr" event={"ID":"d65e0ca5-1e58-4492-bd8d-92ff6d516014","Type":"ContainerDied","Data":"ba189fd8fd2c4bf629df91d397dbbaecdb9387cdb5700245c24f9169e7bd9554"} Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.058331 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba189fd8fd2c4bf629df91d397dbbaecdb9387cdb5700245c24f9169e7bd9554" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.058449 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m74xr" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.095389 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wwvj2" podStartSLOduration=3.363828385 podStartE2EDuration="11.095353938s" podCreationTimestamp="2026-01-29 03:45:36 +0000 UTC" firstStartedPulling="2026-01-29 03:45:38.069052394 +0000 UTC m=+1091.553281299" lastFinishedPulling="2026-01-29 03:45:45.800577947 +0000 UTC m=+1099.284806852" observedRunningTime="2026-01-29 03:45:47.068804772 +0000 UTC m=+1100.553033687" watchObservedRunningTime="2026-01-29 03:45:47.095353938 +0000 UTC m=+1100.579582843" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.123973 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-w56h9"] Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.131032 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-w56h9"] Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.264834 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f60529f-12d2-41ed-8b70-6c63bdadcb55" path="/var/lib/kubelet/pods/8f60529f-12d2-41ed-8b70-6c63bdadcb55/volumes" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.529656 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-z2kk8"] Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530093 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db7e1d7-27c8-4f26-9288-fb94302ba13b" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530112 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db7e1d7-27c8-4f26-9288-fb94302ba13b" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530124 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0046b5a1-ddfb-44d6-9a24-301c0cf61b75" containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530132 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0046b5a1-ddfb-44d6-9a24-301c0cf61b75" containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530141 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0c5c1a-19a4-475f-a810-71feb6ff1d5f" containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530147 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0c5c1a-19a4-475f-a810-71feb6ff1d5f" containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530161 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a74e51-b46b-4d96-ba78-5073504fb9c5" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530168 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a74e51-b46b-4d96-ba78-5073504fb9c5" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530183 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e10248-9205-4bb2-be05-60aa2647f447" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530190 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e10248-9205-4bb2-be05-60aa2647f447" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530201 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04448c3b-f457-4bbb-977f-be67a9a4ba75" containerName="ovn-config" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530208 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="04448c3b-f457-4bbb-977f-be67a9a4ba75" containerName="ovn-config" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530221 
4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530227 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530238 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65e0ca5-1e58-4492-bd8d-92ff6d516014" containerName="glance-db-sync" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530246 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65e0ca5-1e58-4492-bd8d-92ff6d516014" containerName="glance-db-sync" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530256 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f60529f-12d2-41ed-8b70-6c63bdadcb55" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530262 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f60529f-12d2-41ed-8b70-6c63bdadcb55" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530277 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0521623-7e83-4fec-b9b1-3414ae979a0d" containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530284 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0521623-7e83-4fec-b9b1-3414ae979a0d" containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: E0129 03:45:47.530295 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4324488-aeee-4bcb-b62b-8b238db04a68" containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530302 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4324488-aeee-4bcb-b62b-8b238db04a68" 
containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530454 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0521623-7e83-4fec-b9b1-3414ae979a0d" containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530466 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="04448c3b-f457-4bbb-977f-be67a9a4ba75" containerName="ovn-config" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530477 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530484 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db7e1d7-27c8-4f26-9288-fb94302ba13b" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530497 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a74e51-b46b-4d96-ba78-5073504fb9c5" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530507 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e10248-9205-4bb2-be05-60aa2647f447" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530517 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65e0ca5-1e58-4492-bd8d-92ff6d516014" containerName="glance-db-sync" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530525 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0046b5a1-ddfb-44d6-9a24-301c0cf61b75" containerName="mariadb-database-create" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530531 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f60529f-12d2-41ed-8b70-6c63bdadcb55" containerName="mariadb-account-create-update" Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 
03:45:47.530544 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0c5c1a-19a4-475f-a810-71feb6ff1d5f" containerName="mariadb-database-create"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.530553 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4324488-aeee-4bcb-b62b-8b238db04a68" containerName="mariadb-database-create"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.531615 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.552181 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-z2kk8"]
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.700510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.700987 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvsg9\" (UniqueName: \"kubernetes.io/projected/f5476114-774c-414f-9d4b-92617beae345-kube-api-access-kvsg9\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.701026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.701065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.701146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-config\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.802647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvsg9\" (UniqueName: \"kubernetes.io/projected/f5476114-774c-414f-9d4b-92617beae345-kube-api-access-kvsg9\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.802709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.802751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.802789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-config\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.802828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.803896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.803971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-config\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.804135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.804177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.830803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvsg9\" (UniqueName: \"kubernetes.io/projected/f5476114-774c-414f-9d4b-92617beae345-kube-api-access-kvsg9\") pod \"dnsmasq-dns-5b946c75cc-z2kk8\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") " pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:47 crc kubenswrapper[4707]: I0129 03:45:47.867424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:48 crc kubenswrapper[4707]: I0129 03:45:48.086773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"1a2cd072e0f9f0782d64d25aeb021ad9a25a3ee54e82a4108b54253d8c6774db"}
Jan 29 03:45:48 crc kubenswrapper[4707]: I0129 03:45:48.148343 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-z2kk8"]
Jan 29 03:45:49 crc kubenswrapper[4707]: I0129 03:45:49.097287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8" event={"ID":"f5476114-774c-414f-9d4b-92617beae345","Type":"ContainerStarted","Data":"4947f92317d684340605efa541f17a7c79ddb42112a1776060d1348a025f7179"}
Jan 29 03:45:49 crc kubenswrapper[4707]: I0129 03:45:49.105587 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"cd04bdcdf7c0ee725a992237dfd1a6f9a6375d2719392d216d1f00165133cf53"}
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.521942 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dm5xt"]
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.524602 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.527839 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.529101 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dm5xt"]
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.576029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-operator-scripts\") pod \"root-account-create-update-dm5xt\" (UID: \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\") " pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.576140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bczkt\" (UniqueName: \"kubernetes.io/projected/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-kube-api-access-bczkt\") pod \"root-account-create-update-dm5xt\" (UID: \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\") " pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.678992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bczkt\" (UniqueName: \"kubernetes.io/projected/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-kube-api-access-bczkt\") pod \"root-account-create-update-dm5xt\" (UID: \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\") " pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.679170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-operator-scripts\") pod \"root-account-create-update-dm5xt\" (UID: \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\") " pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.680026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-operator-scripts\") pod \"root-account-create-update-dm5xt\" (UID: \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\") " pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.708212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bczkt\" (UniqueName: \"kubernetes.io/projected/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-kube-api-access-bczkt\") pod \"root-account-create-update-dm5xt\" (UID: \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\") " pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:50 crc kubenswrapper[4707]: I0129 03:45:50.889759 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.124949 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5476114-774c-414f-9d4b-92617beae345" containerID="a1d2aa80360a12cdeff97551fb6816161a8b682cde4ea5793a64436deedb4c2d" exitCode=0
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.125066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8" event={"ID":"f5476114-774c-414f-9d4b-92617beae345","Type":"ContainerDied","Data":"a1d2aa80360a12cdeff97551fb6816161a8b682cde4ea5793a64436deedb4c2d"}
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.135984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"249edadf-1bb4-4d39-aae3-40384ba10bae","Type":"ContainerStarted","Data":"4e0473d45dfa3772d64c839852f2877710351e7118d53fb20468977f13dbcc0c"}
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.214451 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=29.379750265 podStartE2EDuration="41.21440483s" podCreationTimestamp="2026-01-29 03:45:10 +0000 UTC" firstStartedPulling="2026-01-29 03:45:33.953642067 +0000 UTC m=+1087.437870972" lastFinishedPulling="2026-01-29 03:45:45.788296632 +0000 UTC m=+1099.272525537" observedRunningTime="2026-01-29 03:45:51.199570228 +0000 UTC m=+1104.683799143" watchObservedRunningTime="2026-01-29 03:45:51.21440483 +0000 UTC m=+1104.698633735"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.361019 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dm5xt"]
Jan 29 03:45:51 crc kubenswrapper[4707]: W0129 03:45:51.363390 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51d16a39_52da_4b3a_bf17_5e18ddbdf13a.slice/crio-dfd131042f008376250f6234ea22a9d1403e2054106e8b2b59d32b2821f405da WatchSource:0}: Error finding container dfd131042f008376250f6234ea22a9d1403e2054106e8b2b59d32b2821f405da: Status 404 returned error can't find the container with id dfd131042f008376250f6234ea22a9d1403e2054106e8b2b59d32b2821f405da
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.489198 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-z2kk8"]
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.523432 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-4m8p6"]
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.525304 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.527610 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.542789 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-4m8p6"]
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.699336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.699417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-config\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.699462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.699547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.699837 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97gg\" (UniqueName: \"kubernetes.io/projected/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-kube-api-access-t97gg\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.699890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.801265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-config\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.801339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.801376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.801444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97gg\" (UniqueName: \"kubernetes.io/projected/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-kube-api-access-t97gg\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.801474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.802311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-config\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.802649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.802649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.802767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.803258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.803471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.826116 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97gg\" (UniqueName: \"kubernetes.io/projected/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-kube-api-access-t97gg\") pod \"dnsmasq-dns-74f6bcbc87-4m8p6\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:51 crc kubenswrapper[4707]: I0129 03:45:51.863998 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:52 crc kubenswrapper[4707]: I0129 03:45:52.148520 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8" event={"ID":"f5476114-774c-414f-9d4b-92617beae345","Type":"ContainerStarted","Data":"a1d467c14530202c27f0293155c3a995c7f2a6a0d58448a4c468aaef84b550c6"}
Jan 29 03:45:52 crc kubenswrapper[4707]: I0129 03:45:52.148992 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:52 crc kubenswrapper[4707]: I0129 03:45:52.150663 4707 generic.go:334] "Generic (PLEG): container finished" podID="51d16a39-52da-4b3a-bf17-5e18ddbdf13a" containerID="870e2315d75415233d0c33e4c9d9bb6fccd5be82d0c4ae21b9404170147f8962" exitCode=0
Jan 29 03:45:52 crc kubenswrapper[4707]: I0129 03:45:52.150738 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dm5xt" event={"ID":"51d16a39-52da-4b3a-bf17-5e18ddbdf13a","Type":"ContainerDied","Data":"870e2315d75415233d0c33e4c9d9bb6fccd5be82d0c4ae21b9404170147f8962"}
Jan 29 03:45:52 crc kubenswrapper[4707]: I0129 03:45:52.150768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dm5xt" event={"ID":"51d16a39-52da-4b3a-bf17-5e18ddbdf13a","Type":"ContainerStarted","Data":"dfd131042f008376250f6234ea22a9d1403e2054106e8b2b59d32b2821f405da"}
Jan 29 03:45:52 crc kubenswrapper[4707]: I0129 03:45:52.152195 4707 generic.go:334] "Generic (PLEG): container finished" podID="f945e029-2a96-43ab-93aa-556eeadfda35" containerID="d7bc57e63ee48165733e5e5556190fa9ad2e5fa814ab960b965446b905cdf70c" exitCode=0
Jan 29 03:45:52 crc kubenswrapper[4707]: I0129 03:45:52.152274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wwvj2" event={"ID":"f945e029-2a96-43ab-93aa-556eeadfda35","Type":"ContainerDied","Data":"d7bc57e63ee48165733e5e5556190fa9ad2e5fa814ab960b965446b905cdf70c"}
Jan 29 03:45:52 crc kubenswrapper[4707]: I0129 03:45:52.176034 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8" podStartSLOduration=5.176012324 podStartE2EDuration="5.176012324s" podCreationTimestamp="2026-01-29 03:45:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:52.169671493 +0000 UTC m=+1105.653900398" watchObservedRunningTime="2026-01-29 03:45:52.176012324 +0000 UTC m=+1105.660241229"
Jan 29 03:45:52 crc kubenswrapper[4707]: I0129 03:45:52.357882 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-4m8p6"]
Jan 29 03:45:52 crc kubenswrapper[4707]: W0129 03:45:52.366298 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae8cbbd2_8ffb_4a7e_b838_f86d415ebc74.slice/crio-0188244b4bb5de10df184679a9516602a8f1a3dc97559b6443a0e351971f02d0 WatchSource:0}: Error finding container 0188244b4bb5de10df184679a9516602a8f1a3dc97559b6443a0e351971f02d0: Status 404 returned error can't find the container with id 0188244b4bb5de10df184679a9516602a8f1a3dc97559b6443a0e351971f02d0
Jan 29 03:45:53 crc kubenswrapper[4707]: I0129 03:45:53.163990 4707 generic.go:334] "Generic (PLEG): container finished" podID="ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" containerID="56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2" exitCode=0
Jan 29 03:45:53 crc kubenswrapper[4707]: I0129 03:45:53.164062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6" event={"ID":"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74","Type":"ContainerDied","Data":"56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2"}
Jan 29 03:45:53 crc kubenswrapper[4707]: I0129 03:45:53.164446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6" event={"ID":"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74","Type":"ContainerStarted","Data":"0188244b4bb5de10df184679a9516602a8f1a3dc97559b6443a0e351971f02d0"}
Jan 29 03:45:53 crc kubenswrapper[4707]: I0129 03:45:53.164725 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8" podUID="f5476114-774c-414f-9d4b-92617beae345" containerName="dnsmasq-dns" containerID="cri-o://a1d467c14530202c27f0293155c3a995c7f2a6a0d58448a4c468aaef84b550c6" gracePeriod=10
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.650268 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.658387 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wwvj2"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.744909 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bczkt\" (UniqueName: \"kubernetes.io/projected/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-kube-api-access-bczkt\") pod \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\" (UID: \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\") "
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.745631 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-operator-scripts\") pod \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\" (UID: \"51d16a39-52da-4b3a-bf17-5e18ddbdf13a\") "
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.746358 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51d16a39-52da-4b3a-bf17-5e18ddbdf13a" (UID: "51d16a39-52da-4b3a-bf17-5e18ddbdf13a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.751673 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-kube-api-access-bczkt" (OuterVolumeSpecName: "kube-api-access-bczkt") pod "51d16a39-52da-4b3a-bf17-5e18ddbdf13a" (UID: "51d16a39-52da-4b3a-bf17-5e18ddbdf13a"). InnerVolumeSpecName "kube-api-access-bczkt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.847504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-config-data\") pod \"f945e029-2a96-43ab-93aa-556eeadfda35\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") "
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.847624 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5s9s\" (UniqueName: \"kubernetes.io/projected/f945e029-2a96-43ab-93aa-556eeadfda35-kube-api-access-m5s9s\") pod \"f945e029-2a96-43ab-93aa-556eeadfda35\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") "
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.847712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-combined-ca-bundle\") pod \"f945e029-2a96-43ab-93aa-556eeadfda35\" (UID: \"f945e029-2a96-43ab-93aa-556eeadfda35\") "
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.848219 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bczkt\" (UniqueName: \"kubernetes.io/projected/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-kube-api-access-bczkt\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.848232 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d16a39-52da-4b3a-bf17-5e18ddbdf13a-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.852295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f945e029-2a96-43ab-93aa-556eeadfda35-kube-api-access-m5s9s" (OuterVolumeSpecName: "kube-api-access-m5s9s") pod "f945e029-2a96-43ab-93aa-556eeadfda35" (UID: "f945e029-2a96-43ab-93aa-556eeadfda35"). InnerVolumeSpecName "kube-api-access-m5s9s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.890977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f945e029-2a96-43ab-93aa-556eeadfda35" (UID: "f945e029-2a96-43ab-93aa-556eeadfda35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.891516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-config-data" (OuterVolumeSpecName: "config-data") pod "f945e029-2a96-43ab-93aa-556eeadfda35" (UID: "f945e029-2a96-43ab-93aa-556eeadfda35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.950129 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.950166 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f945e029-2a96-43ab-93aa-556eeadfda35-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:53.950177 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5s9s\" (UniqueName: \"kubernetes.io/projected/f945e029-2a96-43ab-93aa-556eeadfda35-kube-api-access-m5s9s\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.178409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wwvj2" event={"ID":"f945e029-2a96-43ab-93aa-556eeadfda35","Type":"ContainerDied","Data":"fd9095dd1b23c85a4da1bab34593bc8595534d1ebadaa74c7f9efb251d1ea1f7"}
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.178482 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd9095dd1b23c85a4da1bab34593bc8595534d1ebadaa74c7f9efb251d1ea1f7"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.178436 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wwvj2"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.181418 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5476114-774c-414f-9d4b-92617beae345" containerID="a1d467c14530202c27f0293155c3a995c7f2a6a0d58448a4c468aaef84b550c6" exitCode=0
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.181486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8" event={"ID":"f5476114-774c-414f-9d4b-92617beae345","Type":"ContainerDied","Data":"a1d467c14530202c27f0293155c3a995c7f2a6a0d58448a4c468aaef84b550c6"}
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.184220 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6" event={"ID":"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74","Type":"ContainerStarted","Data":"78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0"}
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.184520 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.190454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dm5xt" event={"ID":"51d16a39-52da-4b3a-bf17-5e18ddbdf13a","Type":"ContainerDied","Data":"dfd131042f008376250f6234ea22a9d1403e2054106e8b2b59d32b2821f405da"}
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.190495 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd131042f008376250f6234ea22a9d1403e2054106e8b2b59d32b2821f405da"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.190548 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dm5xt"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.232108 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6" podStartSLOduration=3.232081006 podStartE2EDuration="3.232081006s" podCreationTimestamp="2026-01-29 03:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:54.205438127 +0000 UTC m=+1107.689667052" watchObservedRunningTime="2026-01-29 03:45:54.232081006 +0000 UTC m=+1107.716309911"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.499764 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-4m8p6"]
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.519862 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4fcwr"]
Jan 29 03:45:54 crc kubenswrapper[4707]: E0129 03:45:54.520406 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d16a39-52da-4b3a-bf17-5e18ddbdf13a" containerName="mariadb-account-create-update"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.520419 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d16a39-52da-4b3a-bf17-5e18ddbdf13a" containerName="mariadb-account-create-update"
Jan 29 03:45:54 crc kubenswrapper[4707]: E0129 03:45:54.520431 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f945e029-2a96-43ab-93aa-556eeadfda35" containerName="keystone-db-sync"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.520438 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f945e029-2a96-43ab-93aa-556eeadfda35" containerName="keystone-db-sync"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.520641 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f945e029-2a96-43ab-93aa-556eeadfda35" containerName="keystone-db-sync"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.520669 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d16a39-52da-4b3a-bf17-5e18ddbdf13a" containerName="mariadb-account-create-update"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.521824 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-4fcwr"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.554194 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kh5gh"]
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.556787 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kh5gh"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.559564 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.559878 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.562463 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.562688 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.562850 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdlvf"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.641396 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4fcwr"]
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.674251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-config-data\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.674305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-scripts\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.674332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-config\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.674352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr"
Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.674386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr"
Jan 29
03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.674436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-combined-ca-bundle\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.675881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-fernet-keys\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.676016 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw8b4\" (UniqueName: \"kubernetes.io/projected/70d1fe5e-caa0-4cb3-bf06-18756581018a-kube-api-access-hw8b4\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.676112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-svc\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.676178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-credential-keys\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " 
pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.676216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvkjj\" (UniqueName: \"kubernetes.io/projected/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-kube-api-access-gvkjj\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.676415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.684839 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kh5gh"] Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.758634 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-wbtsz"] Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.760277 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.766064 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.766305 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-xg84d" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-svc\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-credential-keys\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvkjj\" (UniqueName: \"kubernetes.io/projected/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-kube-api-access-gvkjj\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781427 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-config-data\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781449 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-scripts\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-config\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-combined-ca-bundle\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-fernet-keys\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.781677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw8b4\" (UniqueName: \"kubernetes.io/projected/70d1fe5e-caa0-4cb3-bf06-18756581018a-kube-api-access-hw8b4\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.782922 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-svc\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.787151 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.787730 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.788399 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wbtsz"] Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.790446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.796518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-config\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.801771 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-fernet-keys\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.802352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-credential-keys\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.802712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-config-data\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.809929 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-scripts\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.818138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-combined-ca-bundle\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.855943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw8b4\" (UniqueName: \"kubernetes.io/projected/70d1fe5e-caa0-4cb3-bf06-18756581018a-kube-api-access-hw8b4\") pod \"dnsmasq-dns-847c4cc679-4fcwr\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") " pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.865812 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvkjj\" (UniqueName: \"kubernetes.io/projected/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-kube-api-access-gvkjj\") pod \"keystone-bootstrap-kh5gh\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.872305 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gzpxq"] Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.874253 4707 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.878966 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tgh4r" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.879368 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.879612 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.889204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-config-data\") pod \"heat-db-sync-wbtsz\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.889672 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8qrk\" (UniqueName: \"kubernetes.io/projected/6808d614-6634-4b2a-9e78-7480a0921415-kube-api-access-b8qrk\") pod \"heat-db-sync-wbtsz\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.889833 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-combined-ca-bundle\") pod \"heat-db-sync-wbtsz\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.906449 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gzpxq"] Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.936124 4707 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.975762 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-2hmmz"] Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.982829 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-2hmmz" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.997226 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.997711 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8vm9d" Jan 29 03:45:54 crc kubenswrapper[4707]: I0129 03:45:54.998355 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.005490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx4fg\" (UniqueName: \"kubernetes.io/projected/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-kube-api-access-kx4fg\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.005563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8qrk\" (UniqueName: \"kubernetes.io/projected/6808d614-6634-4b2a-9e78-7480a0921415-kube-api-access-b8qrk\") pod \"heat-db-sync-wbtsz\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.005604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-db-sync-config-data\") pod 
\"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.005621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-scripts\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.005662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-combined-ca-bundle\") pod \"heat-db-sync-wbtsz\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.005715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-etc-machine-id\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.005753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-config-data\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.005795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-config-data\") pod \"heat-db-sync-wbtsz\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " pod="openstack/heat-db-sync-wbtsz" Jan 
29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.005817 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-combined-ca-bundle\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.006497 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-2hmmz"] Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.012480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-combined-ca-bundle\") pod \"heat-db-sync-wbtsz\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.024144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-config-data\") pod \"heat-db-sync-wbtsz\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.027725 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-9gbdt"] Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.029438 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9gbdt" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.037361 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.037951 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-c7t8z" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.038267 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.040676 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4fcwr"] Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.041823 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.050136 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8qrk\" (UniqueName: \"kubernetes.io/projected/6808d614-6634-4b2a-9e78-7480a0921415-kube-api-access-b8qrk\") pod \"heat-db-sync-wbtsz\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.077718 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9gbdt"] Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.099205 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-wbtsz" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.108086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx4fg\" (UniqueName: \"kubernetes.io/projected/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-kube-api-access-kx4fg\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.108148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-db-sync-config-data\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.108172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-scripts\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.108236 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-etc-machine-id\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.108262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-config\") pod \"neutron-db-sync-2hmmz\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " pod="openstack/neutron-db-sync-2hmmz" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.108296 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-config-data\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.108317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfmrc\" (UniqueName: \"kubernetes.io/projected/42399f3f-f8c5-45dc-b192-b3a8997c4636-kube-api-access-kfmrc\") pod \"neutron-db-sync-2hmmz\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " pod="openstack/neutron-db-sync-2hmmz" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.108349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-combined-ca-bundle\") pod \"neutron-db-sync-2hmmz\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " pod="openstack/neutron-db-sync-2hmmz" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.108369 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-combined-ca-bundle\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.111176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-etc-machine-id\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.112206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-combined-ca-bundle\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.112496 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gjzwn"]
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.113930 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.117528 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-px95z"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.117888 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.128093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-db-sync-config-data\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.133852 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-scripts\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.134787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-config-data\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.142588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx4fg\" (UniqueName: \"kubernetes.io/projected/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-kube-api-access-kx4fg\") pod \"cinder-db-sync-gzpxq\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " pod="openstack/cinder-db-sync-gzpxq"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.142675 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nqp7c"]
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.146217 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.182649 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gjzwn"]
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.200761 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nqp7c"]
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.211594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-config-data\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.211690 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fb185b-2bb2-4cc2-8572-b38db5027edb-logs\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.211774 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9r9b\" (UniqueName: \"kubernetes.io/projected/3a2cf721-9a8b-49ab-9e57-1337f407db4f-kube-api-access-m9r9b\") pod \"barbican-db-sync-gjzwn\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.211817 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-combined-ca-bundle\") pod \"barbican-db-sync-gjzwn\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.211869 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-db-sync-config-data\") pod \"barbican-db-sync-gjzwn\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.211943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-config\") pod \"neutron-db-sync-2hmmz\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " pod="openstack/neutron-db-sync-2hmmz"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.211981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvznl\" (UniqueName: \"kubernetes.io/projected/c3fb185b-2bb2-4cc2-8572-b38db5027edb-kube-api-access-fvznl\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.212004 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-scripts\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.212022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfmrc\" (UniqueName: \"kubernetes.io/projected/42399f3f-f8c5-45dc-b192-b3a8997c4636-kube-api-access-kfmrc\") pod \"neutron-db-sync-2hmmz\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " pod="openstack/neutron-db-sync-2hmmz"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.212074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-combined-ca-bundle\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.212105 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-combined-ca-bundle\") pod \"neutron-db-sync-2hmmz\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " pod="openstack/neutron-db-sync-2hmmz"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.238805 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-combined-ca-bundle\") pod \"neutron-db-sync-2hmmz\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " pod="openstack/neutron-db-sync-2hmmz"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.241501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-config\") pod \"neutron-db-sync-2hmmz\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " pod="openstack/neutron-db-sync-2hmmz"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.272036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfmrc\" (UniqueName: \"kubernetes.io/projected/42399f3f-f8c5-45dc-b192-b3a8997c4636-kube-api-access-kfmrc\") pod \"neutron-db-sync-2hmmz\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " pod="openstack/neutron-db-sync-2hmmz"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.299610 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gzpxq"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9r9b\" (UniqueName: \"kubernetes.io/projected/3a2cf721-9a8b-49ab-9e57-1337f407db4f-kube-api-access-m9r9b\") pod \"barbican-db-sync-gjzwn\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-combined-ca-bundle\") pod \"barbican-db-sync-gjzwn\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313703 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-db-sync-config-data\") pod \"barbican-db-sync-gjzwn\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313813 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvznl\" (UniqueName: \"kubernetes.io/projected/c3fb185b-2bb2-4cc2-8572-b38db5027edb-kube-api-access-fvznl\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcws\" (UniqueName: \"kubernetes.io/projected/c6de908f-0466-47f1-9a1f-bd9a306e98ac-kube-api-access-4wcws\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-scripts\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-combined-ca-bundle\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-config-data\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.313998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fb185b-2bb2-4cc2-8572-b38db5027edb-logs\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.314017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-config\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.318214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-combined-ca-bundle\") pod \"barbican-db-sync-gjzwn\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.318757 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-db-sync-config-data\") pod \"barbican-db-sync-gjzwn\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.321045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-scripts\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.321983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-combined-ca-bundle\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.324106 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-config-data\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.332377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fb185b-2bb2-4cc2-8572-b38db5027edb-logs\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.344678 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-2hmmz"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.353985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvznl\" (UniqueName: \"kubernetes.io/projected/c3fb185b-2bb2-4cc2-8572-b38db5027edb-kube-api-access-fvznl\") pod \"placement-db-sync-9gbdt\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.354470 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9gbdt"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.360166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9r9b\" (UniqueName: \"kubernetes.io/projected/3a2cf721-9a8b-49ab-9e57-1337f407db4f-kube-api-access-m9r9b\") pod \"barbican-db-sync-gjzwn\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.415891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-config\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.415974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.415999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.416030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.416118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcws\" (UniqueName: \"kubernetes.io/projected/c6de908f-0466-47f1-9a1f-bd9a306e98ac-kube-api-access-4wcws\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.416144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.417513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.418046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.419823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.421025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-config\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.421298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.456625 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gjzwn"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.456907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcws\" (UniqueName: \"kubernetes.io/projected/c6de908f-0466-47f1-9a1f-bd9a306e98ac-kube-api-access-4wcws\") pod \"dnsmasq-dns-785d8bcb8c-nqp7c\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.489774 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.567912 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.649245 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 03:45:55 crc kubenswrapper[4707]: E0129 03:45:55.649892 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5476114-774c-414f-9d4b-92617beae345" containerName="dnsmasq-dns"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.678361 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5476114-774c-414f-9d4b-92617beae345" containerName="dnsmasq-dns"
Jan 29 03:45:55 crc kubenswrapper[4707]: E0129 03:45:55.678494 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5476114-774c-414f-9d4b-92617beae345" containerName="init"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.678502 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5476114-774c-414f-9d4b-92617beae345" containerName="init"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.679255 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5476114-774c-414f-9d4b-92617beae345" containerName="dnsmasq-dns"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.691157 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.703985 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tqq24"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.720120 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.720512 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.720744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.739957 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-dns-svc\") pod \"f5476114-774c-414f-9d4b-92617beae345\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") "
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.740103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-config\") pod \"f5476114-774c-414f-9d4b-92617beae345\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") "
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.740131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-nb\") pod \"f5476114-774c-414f-9d4b-92617beae345\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") "
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.740281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvsg9\" (UniqueName: \"kubernetes.io/projected/f5476114-774c-414f-9d4b-92617beae345-kube-api-access-kvsg9\") pod \"f5476114-774c-414f-9d4b-92617beae345\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") "
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.740364 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-sb\") pod \"f5476114-774c-414f-9d4b-92617beae345\" (UID: \"f5476114-774c-414f-9d4b-92617beae345\") "
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.759178 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.834778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5476114-774c-414f-9d4b-92617beae345-kube-api-access-kvsg9" (OuterVolumeSpecName: "kube-api-access-kvsg9") pod "f5476114-774c-414f-9d4b-92617beae345" (UID: "f5476114-774c-414f-9d4b-92617beae345"). InnerVolumeSpecName "kube-api-access-kvsg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.843636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.843726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljgv\" (UniqueName: \"kubernetes.io/projected/2ab1334d-c75f-418c-b493-e8ce0ce95bef-kube-api-access-tljgv\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.843772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.843810 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.843848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.843876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-logs\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.843896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.843929 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.843994 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvsg9\" (UniqueName: \"kubernetes.io/projected/f5476114-774c-414f-9d4b-92617beae345-kube-api-access-kvsg9\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.899487 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5476114-774c-414f-9d4b-92617beae345" (UID: "f5476114-774c-414f-9d4b-92617beae345"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.918327 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-config" (OuterVolumeSpecName: "config") pod "f5476114-774c-414f-9d4b-92617beae345" (UID: "f5476114-774c-414f-9d4b-92617beae345"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.946024 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.948118 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tljgv\" (UniqueName: \"kubernetes.io/projected/2ab1334d-c75f-418c-b493-e8ce0ce95bef-kube-api-access-tljgv\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950822 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-logs\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950863 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950894 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950954 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.950965 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.951380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.952248 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.958275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.969236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-logs\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.970331 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.975713 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.975923 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.976399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.980480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:55 crc kubenswrapper[4707]: I0129 03:45:55.983894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5476114-774c-414f-9d4b-92617beae345" (UID: "f5476114-774c-414f-9d4b-92617beae345"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.001037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.048397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tljgv\" (UniqueName: \"kubernetes.io/projected/2ab1334d-c75f-418c-b493-e8ce0ce95bef-kube-api-access-tljgv\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.052426 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.056852 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5476114-774c-414f-9d4b-92617beae345" (UID: "f5476114-774c-414f-9d4b-92617beae345"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.087222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") " pod="openstack/glance-default-external-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.088860 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4fcwr"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.105183 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kh5gh"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.166617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.167140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.167222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9q5v\" (UniqueName: \"kubernetes.io/projected/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-kube-api-access-n9q5v\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc 
kubenswrapper[4707]: I0129 03:45:56.167271 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-logs\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.167319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.167378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.167420 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.167457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc 
kubenswrapper[4707]: I0129 03:45:56.167563 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5476114-774c-414f-9d4b-92617beae345-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.257496 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.261308 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.269288 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.270490 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.272217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.272262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.272982 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") device mount path \"/mnt/openstack/pv08\"" 
pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.283751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kh5gh" event={"ID":"08bfd852-d6e3-4e65-845b-f2489a6e0fb1","Type":"ContainerStarted","Data":"e83ff0cf2103a1c0f5dc68fa3b4b185ca231867e8cb7ed822e3e0d469ffdce6e"} Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.287607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-scripts\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.287736 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8p55\" (UniqueName: \"kubernetes.io/projected/fef3f370-7e00-4c47-be4a-23a8919e0c89-kube-api-access-c8p55\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.287815 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.287885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.288007 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.288188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.288278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-run-httpd\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.288418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.288644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9q5v\" (UniqueName: \"kubernetes.io/projected/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-kube-api-access-n9q5v\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.288766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-config-data\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.288876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-log-httpd\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.288980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-logs\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.289087 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.290053 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.290600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.291926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-logs\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.305522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.305709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" event={"ID":"70d1fe5e-caa0-4cb3-bf06-18756581018a","Type":"ContainerStarted","Data":"96acd47bd370e4ae4a9e215478202023e0870381500602bfc6896a26d4572ffd"} Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.309942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.313954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.314906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: W0129 03:45:56.320605 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6808d614_6634_4b2a_9e78_7480a0921415.slice/crio-f3fe3eb0e4c11ab308c643d804a5fad60892d3e16f17e05de146b7dd3a28053f WatchSource:0}: Error finding container f3fe3eb0e4c11ab308c643d804a5fad60892d3e16f17e05de146b7dd3a28053f: Status 404 returned error can't find the container with id f3fe3eb0e4c11ab308c643d804a5fad60892d3e16f17e05de146b7dd3a28053f Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.321041 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wbtsz"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.321998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9q5v\" (UniqueName: \"kubernetes.io/projected/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-kube-api-access-n9q5v\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.326811 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8" event={"ID":"f5476114-774c-414f-9d4b-92617beae345","Type":"ContainerDied","Data":"4947f92317d684340605efa541f17a7c79ddb42112a1776060d1348a025f7179"} Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.326844 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6" podUID="ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" containerName="dnsmasq-dns" containerID="cri-o://78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0" gracePeriod=10 Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.326961 4707 scope.go:117] "RemoveContainer" 
containerID="a1d467c14530202c27f0293155c3a995c7f2a6a0d58448a4c468aaef84b550c6" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.327139 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-z2kk8" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.341655 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.382188 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.390722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-config-data\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.390774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-log-httpd\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.390833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.390864 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-scripts\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.390884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8p55\" (UniqueName: \"kubernetes.io/projected/fef3f370-7e00-4c47-be4a-23a8919e0c89-kube-api-access-c8p55\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.390915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.390951 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-run-httpd\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.391635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-run-httpd\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.394913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-log-httpd\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc 
kubenswrapper[4707]: I0129 03:45:56.401742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.404789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-scripts\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.408185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-config-data\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.412944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8p55\" (UniqueName: \"kubernetes.io/projected/fef3f370-7e00-4c47-be4a-23a8919e0c89-kube-api-access-c8p55\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.429703 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.605660 4707 scope.go:117] "RemoveContainer" containerID="a1d2aa80360a12cdeff97551fb6816161a8b682cde4ea5793a64436deedb4c2d" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.612081 4707 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.615156 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.633198 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-z2kk8"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.655043 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-z2kk8"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.849628 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gzpxq"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.864486 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9gbdt"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.872113 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nqp7c"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.943729 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gjzwn"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.951560 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 03:45:56 crc kubenswrapper[4707]: I0129 03:45:56.960010 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-2hmmz"] Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.018005 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.086947 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6" Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.113912 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-svc\") pod \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.113978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-swift-storage-0\") pod \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.114032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t97gg\" (UniqueName: \"kubernetes.io/projected/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-kube-api-access-t97gg\") pod \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.114057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-sb\") pod \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.114104 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-config\") pod \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") " Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.114130 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-nb\") pod \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\" (UID: \"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74\") "
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.128904 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-kube-api-access-t97gg" (OuterVolumeSpecName: "kube-api-access-t97gg") pod "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" (UID: "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74"). InnerVolumeSpecName "kube-api-access-t97gg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.243127 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t97gg\" (UniqueName: \"kubernetes.io/projected/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-kube-api-access-t97gg\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.293479 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5476114-774c-414f-9d4b-92617beae345" path="/var/lib/kubelet/pods/f5476114-774c-414f-9d4b-92617beae345/volumes"
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.363517 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-config" (OuterVolumeSpecName: "config") pod "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" (UID: "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.365426 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" (UID: "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.373163 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" (UID: "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.386413 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.388833 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.388935 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.404597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" (UID: "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.411107 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" (UID: "ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.456788 4707 generic.go:334] "Generic (PLEG): container finished" podID="ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" containerID="78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0" exitCode=0
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.456927 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6"
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.469238 4707 generic.go:334] "Generic (PLEG): container finished" podID="70d1fe5e-caa0-4cb3-bf06-18756581018a" containerID="a02e8b13f603d91f2e5acfd5b21468ac57232be7707410e58441920d5fd76522" exitCode=0
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.490578 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.490611 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.490989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9gbdt" event={"ID":"c3fb185b-2bb2-4cc2-8572-b38db5027edb","Type":"ContainerStarted","Data":"582dd21d9e73b68736fa78e4bb97ee66ea66dd6a1fa27c3ad035d6f975397acf"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.491094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6" event={"ID":"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74","Type":"ContainerDied","Data":"78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.491163 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dm5xt"]
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.491184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-4m8p6" event={"ID":"ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74","Type":"ContainerDied","Data":"0188244b4bb5de10df184679a9516602a8f1a3dc97559b6443a0e351971f02d0"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.491239 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gjzwn" event={"ID":"3a2cf721-9a8b-49ab-9e57-1337f407db4f","Type":"ContainerStarted","Data":"2caa257e8152804acc902d3ea3ca145fb6d03484778d9d9193b38869740a7af6"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.491263 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" event={"ID":"70d1fe5e-caa0-4cb3-bf06-18756581018a","Type":"ContainerDied","Data":"a02e8b13f603d91f2e5acfd5b21468ac57232be7707410e58441920d5fd76522"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.491324 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dm5xt"]
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.491348 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.491431 4707 scope.go:117] "RemoveContainer" containerID="78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0"
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.493014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c" event={"ID":"c6de908f-0466-47f1-9a1f-bd9a306e98ac","Type":"ContainerStarted","Data":"b097362ce1d78d96c4ab613b9046086dc51530f99fd092f631953e07a785ef91"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.499980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gzpxq" event={"ID":"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a","Type":"ContainerStarted","Data":"f86bd62599e9bffdf6a60b1d3fbe5c8b28df051eab3a359c37f71e7e99a18546"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.505093 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.578890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kh5gh" event={"ID":"08bfd852-d6e3-4e65-845b-f2489a6e0fb1","Type":"ContainerStarted","Data":"8df1c720e0202184135b4823a79e815fd92bc89c379158fee4f43c11bfaeea8b"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.592862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wbtsz" event={"ID":"6808d614-6634-4b2a-9e78-7480a0921415","Type":"ContainerStarted","Data":"f3fe3eb0e4c11ab308c643d804a5fad60892d3e16f17e05de146b7dd3a28053f"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.634191 4707 scope.go:117] "RemoveContainer" containerID="56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2"
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.646864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2hmmz" event={"ID":"42399f3f-f8c5-45dc-b192-b3a8997c4636","Type":"ContainerStarted","Data":"1d06d2195fbb66f192512969a3a52b58654706a6c2cd889f661d72264a0984e7"}
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.751175 4707 scope.go:117] "RemoveContainer" containerID="78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0"
Jan 29 03:45:57 crc kubenswrapper[4707]: E0129 03:45:57.755281 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0\": container with ID starting with 78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0 not found: ID does not exist" containerID="78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0"
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.755314 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0"} err="failed to get container status \"78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0\": rpc error: code = NotFound desc = could not find container \"78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0\": container with ID starting with 78f480955dcb3fa63e66ba8a214c8a6139fac229ca5b31f29cdf816f75b83cb0 not found: ID does not exist"
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.755340 4707 scope.go:117] "RemoveContainer" containerID="56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2"
Jan 29 03:45:57 crc kubenswrapper[4707]: E0129 03:45:57.757265 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2\": container with ID starting with 56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2 not found: ID does not exist" containerID="56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2"
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.757284 4707 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2"} err="failed to get container status \"56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2\": rpc error: code = NotFound desc = could not find container \"56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2\": container with ID starting with 56b9c8e7d8dd10614b795070bf3d5e15add9af3f98c6e11c72386e91b3337ad2 not found: ID does not exist"
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.772551 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-4m8p6"]
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.796032 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-4m8p6"]
Jan 29 03:45:57 crc kubenswrapper[4707]: W0129 03:45:57.807000 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfef3f370_7e00_4c47_be4a_23a8919e0c89.slice/crio-93111b29842fc06a3a387200a23c87704f034a5a4cc6ab30d960b0da11915186 WatchSource:0}: Error finding container 93111b29842fc06a3a387200a23c87704f034a5a4cc6ab30d960b0da11915186: Status 404 returned error can't find the container with id 93111b29842fc06a3a387200a23c87704f034a5a4cc6ab30d960b0da11915186
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.814725 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.818966 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kh5gh" podStartSLOduration=3.818904092 podStartE2EDuration="3.818904092s" podCreationTimestamp="2026-01-29 03:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:57.677802881 +0000 UTC m=+1111.162031786" watchObservedRunningTime="2026-01-29 03:45:57.818904092 +0000 UTC m=+1111.303132987"
Jan 29 03:45:57 crc kubenswrapper[4707]: I0129 03:45:57.848975 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.038516 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-4fcwr"
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.109143 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-config\") pod \"70d1fe5e-caa0-4cb3-bf06-18756581018a\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") "
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.109371 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw8b4\" (UniqueName: \"kubernetes.io/projected/70d1fe5e-caa0-4cb3-bf06-18756581018a-kube-api-access-hw8b4\") pod \"70d1fe5e-caa0-4cb3-bf06-18756581018a\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") "
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.109462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-swift-storage-0\") pod \"70d1fe5e-caa0-4cb3-bf06-18756581018a\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") "
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.109523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-nb\") pod \"70d1fe5e-caa0-4cb3-bf06-18756581018a\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") "
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.109690 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-svc\") pod \"70d1fe5e-caa0-4cb3-bf06-18756581018a\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") "
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.109796 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-sb\") pod \"70d1fe5e-caa0-4cb3-bf06-18756581018a\" (UID: \"70d1fe5e-caa0-4cb3-bf06-18756581018a\") "
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.130664 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d1fe5e-caa0-4cb3-bf06-18756581018a-kube-api-access-hw8b4" (OuterVolumeSpecName: "kube-api-access-hw8b4") pod "70d1fe5e-caa0-4cb3-bf06-18756581018a" (UID: "70d1fe5e-caa0-4cb3-bf06-18756581018a"). InnerVolumeSpecName "kube-api-access-hw8b4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.141429 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-config" (OuterVolumeSpecName: "config") pod "70d1fe5e-caa0-4cb3-bf06-18756581018a" (UID: "70d1fe5e-caa0-4cb3-bf06-18756581018a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.168589 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "70d1fe5e-caa0-4cb3-bf06-18756581018a" (UID: "70d1fe5e-caa0-4cb3-bf06-18756581018a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.177016 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "70d1fe5e-caa0-4cb3-bf06-18756581018a" (UID: "70d1fe5e-caa0-4cb3-bf06-18756581018a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.177129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "70d1fe5e-caa0-4cb3-bf06-18756581018a" (UID: "70d1fe5e-caa0-4cb3-bf06-18756581018a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.181349 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70d1fe5e-caa0-4cb3-bf06-18756581018a" (UID: "70d1fe5e-caa0-4cb3-bf06-18756581018a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.214116 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.214167 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.214182 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw8b4\" (UniqueName: \"kubernetes.io/projected/70d1fe5e-caa0-4cb3-bf06-18756581018a-kube-api-access-hw8b4\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.214200 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.214212 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.214222 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d1fe5e-caa0-4cb3-bf06-18756581018a-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.704506 4707 generic.go:334] "Generic (PLEG): container finished" podID="c6de908f-0466-47f1-9a1f-bd9a306e98ac" containerID="4cba5a88b930ff177d7f3114c2e2b8e6ab0e091402f14d9b24d35fd3f296a837" exitCode=0
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.704658 4707 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c" event={"ID":"c6de908f-0466-47f1-9a1f-bd9a306e98ac","Type":"ContainerDied","Data":"4cba5a88b930ff177d7f3114c2e2b8e6ab0e091402f14d9b24d35fd3f296a837"}
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.707577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f","Type":"ContainerStarted","Data":"e5f44b579e3776e22cef6bf5d22252567ee941572c9ced64a79ea5a6435bbbf9"}
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.730635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fef3f370-7e00-4c47-be4a-23a8919e0c89","Type":"ContainerStarted","Data":"93111b29842fc06a3a387200a23c87704f034a5a4cc6ab30d960b0da11915186"}
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.773287 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-4fcwr"
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.773287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-4fcwr" event={"ID":"70d1fe5e-caa0-4cb3-bf06-18756581018a","Type":"ContainerDied","Data":"96acd47bd370e4ae4a9e215478202023e0870381500602bfc6896a26d4572ffd"}
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.773463 4707 scope.go:117] "RemoveContainer" containerID="a02e8b13f603d91f2e5acfd5b21468ac57232be7707410e58441920d5fd76522"
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.781764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2hmmz" event={"ID":"42399f3f-f8c5-45dc-b192-b3a8997c4636","Type":"ContainerStarted","Data":"d94c998a7d7df8bac0a22de419445fed83c8d7899f59d768dc82fc00977ce505"}
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.800448 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ab1334d-c75f-418c-b493-e8ce0ce95bef","Type":"ContainerStarted","Data":"7e8a11d7b5de8e2bc34d7bf6e95195ed9db206704a0f2435fa50b40ccb4cfab9"}
Jan 29 03:45:58 crc kubenswrapper[4707]: I0129 03:45:58.806927 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-2hmmz" podStartSLOduration=4.806905788 podStartE2EDuration="4.806905788s" podCreationTimestamp="2026-01-29 03:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:58.800615628 +0000 UTC m=+1112.284844523" watchObservedRunningTime="2026-01-29 03:45:58.806905788 +0000 UTC m=+1112.291134683"
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.109532 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4fcwr"]
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.117834 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-4fcwr"]
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.260223 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d16a39-52da-4b3a-bf17-5e18ddbdf13a" path="/var/lib/kubelet/pods/51d16a39-52da-4b3a-bf17-5e18ddbdf13a/volumes"
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.261025 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70d1fe5e-caa0-4cb3-bf06-18756581018a" path="/var/lib/kubelet/pods/70d1fe5e-caa0-4cb3-bf06-18756581018a/volumes"
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.261827 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" path="/var/lib/kubelet/pods/ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74/volumes"
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.827101 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ab1334d-c75f-418c-b493-e8ce0ce95bef","Type":"ContainerStarted","Data":"e368866ea1aaa74994e984f56302ad79985763d5e596715969d33d8ec220ac39"}
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.830744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c" event={"ID":"c6de908f-0466-47f1-9a1f-bd9a306e98ac","Type":"ContainerStarted","Data":"1d84ccd48450f7d7325c0a65191a8ce44d451628148d6e9ef16535f4d32b2bb1"}
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.831368 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.847910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f","Type":"ContainerStarted","Data":"a30b5d17914454e8574a8b8a6ea87619dc8c9583b112099aaf634d5f745e69fd"}
Jan 29 03:45:59 crc kubenswrapper[4707]: I0129 03:45:59.852303 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c" podStartSLOduration=5.852282508 podStartE2EDuration="5.852282508s" podCreationTimestamp="2026-01-29 03:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:45:59.848378537 +0000 UTC m=+1113.332607452" watchObservedRunningTime="2026-01-29 03:45:59.852282508 +0000 UTC m=+1113.336511413"
Jan 29 03:46:00 crc kubenswrapper[4707]: I0129 03:46:00.871911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ab1334d-c75f-418c-b493-e8ce0ce95bef","Type":"ContainerStarted","Data":"3577cc90e3775ceb307b578a0146c6b6202bbb36ff7f955e561a9b90d5052b68"}
Jan 29 03:46:00 crc kubenswrapper[4707]: I0129 03:46:00.872214 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerName="glance-log" containerID="cri-o://e368866ea1aaa74994e984f56302ad79985763d5e596715969d33d8ec220ac39" gracePeriod=30
Jan 29 03:46:00 crc kubenswrapper[4707]: I0129 03:46:00.872496 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerName="glance-httpd" containerID="cri-o://3577cc90e3775ceb307b578a0146c6b6202bbb36ff7f955e561a9b90d5052b68" gracePeriod=30
Jan 29 03:46:00 crc kubenswrapper[4707]: I0129 03:46:00.876412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f","Type":"ContainerStarted","Data":"a01554e2fade367a3e2f15769d535bf0bb8c678547729bc17c31b2e4154f2040"}
Jan 29 03:46:00 crc kubenswrapper[4707]: I0129 03:46:00.876750 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerName="glance-log" containerID="cri-o://a30b5d17914454e8574a8b8a6ea87619dc8c9583b112099aaf634d5f745e69fd" gracePeriod=30
Jan 29 03:46:00 crc kubenswrapper[4707]: I0129 03:46:00.876774 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerName="glance-httpd" containerID="cri-o://a01554e2fade367a3e2f15769d535bf0bb8c678547729bc17c31b2e4154f2040" gracePeriod=30
Jan 29 03:46:00 crc kubenswrapper[4707]: I0129 03:46:00.901412 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.901385984 podStartE2EDuration="6.901385984s" podCreationTimestamp="2026-01-29 03:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:00.896455794 +0000 UTC m=+1114.380684709" watchObservedRunningTime="2026-01-29 03:46:00.901385984 +0000 UTC m=+1114.385614889"
Jan 29 03:46:00 crc kubenswrapper[4707]: I0129 03:46:00.925682 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.925658616 podStartE2EDuration="6.925658616s" podCreationTimestamp="2026-01-29 03:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:00.923522685 +0000 UTC m=+1114.407751610" watchObservedRunningTime="2026-01-29 03:46:00.925658616 +0000 UTC m=+1114.409887521"
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.892444 4707 generic.go:334] "Generic (PLEG): container finished" podID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerID="3577cc90e3775ceb307b578a0146c6b6202bbb36ff7f955e561a9b90d5052b68" exitCode=0
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.892483 4707 generic.go:334] "Generic (PLEG): container finished" podID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerID="e368866ea1aaa74994e984f56302ad79985763d5e596715969d33d8ec220ac39" exitCode=143
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.892527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ab1334d-c75f-418c-b493-e8ce0ce95bef","Type":"ContainerDied","Data":"3577cc90e3775ceb307b578a0146c6b6202bbb36ff7f955e561a9b90d5052b68"}
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.892621 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ab1334d-c75f-418c-b493-e8ce0ce95bef","Type":"ContainerDied","Data":"e368866ea1aaa74994e984f56302ad79985763d5e596715969d33d8ec220ac39"}
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.895560 4707
generic.go:334] "Generic (PLEG): container finished" podID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerID="a01554e2fade367a3e2f15769d535bf0bb8c678547729bc17c31b2e4154f2040" exitCode=0
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.895592 4707 generic.go:334] "Generic (PLEG): container finished" podID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerID="a30b5d17914454e8574a8b8a6ea87619dc8c9583b112099aaf634d5f745e69fd" exitCode=143
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.895648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f","Type":"ContainerDied","Data":"a01554e2fade367a3e2f15769d535bf0bb8c678547729bc17c31b2e4154f2040"}
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.895712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f","Type":"ContainerDied","Data":"a30b5d17914454e8574a8b8a6ea87619dc8c9583b112099aaf634d5f745e69fd"}
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.897869 4707 generic.go:334] "Generic (PLEG): container finished" podID="08bfd852-d6e3-4e65-845b-f2489a6e0fb1" containerID="8df1c720e0202184135b4823a79e815fd92bc89c379158fee4f43c11bfaeea8b" exitCode=0
Jan 29 03:46:01 crc kubenswrapper[4707]: I0129 03:46:01.897915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kh5gh" event={"ID":"08bfd852-d6e3-4e65-845b-f2489a6e0fb1","Type":"ContainerDied","Data":"8df1c720e0202184135b4823a79e815fd92bc89c379158fee4f43c11bfaeea8b"}
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.187399 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5k88w"]
Jan 29 03:46:02 crc kubenswrapper[4707]: E0129 03:46:02.187848 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" containerName="init"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.187879 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" containerName="init"
Jan 29 03:46:02 crc kubenswrapper[4707]: E0129 03:46:02.187899 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" containerName="dnsmasq-dns"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.187905 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" containerName="dnsmasq-dns"
Jan 29 03:46:02 crc kubenswrapper[4707]: E0129 03:46:02.187917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d1fe5e-caa0-4cb3-bf06-18756581018a" containerName="init"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.187924 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d1fe5e-caa0-4cb3-bf06-18756581018a" containerName="init"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.188105 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8cbbd2-8ffb-4a7e-b838-f86d415ebc74" containerName="dnsmasq-dns"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.188129 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="70d1fe5e-caa0-4cb3-bf06-18756581018a" containerName="init"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.188880 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.191375 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.242312 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5k88w"]
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.284885 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjqms\" (UniqueName: \"kubernetes.io/projected/ae136de8-5472-4668-9856-3b7b45942c99-kube-api-access-vjqms\") pod \"root-account-create-update-5k88w\" (UID: \"ae136de8-5472-4668-9856-3b7b45942c99\") " pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.285792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae136de8-5472-4668-9856-3b7b45942c99-operator-scripts\") pod \"root-account-create-update-5k88w\" (UID: \"ae136de8-5472-4668-9856-3b7b45942c99\") " pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.390652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae136de8-5472-4668-9856-3b7b45942c99-operator-scripts\") pod \"root-account-create-update-5k88w\" (UID: \"ae136de8-5472-4668-9856-3b7b45942c99\") " pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.390824 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjqms\" (UniqueName: \"kubernetes.io/projected/ae136de8-5472-4668-9856-3b7b45942c99-kube-api-access-vjqms\") pod \"root-account-create-update-5k88w\" (UID: \"ae136de8-5472-4668-9856-3b7b45942c99\") " pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.392098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae136de8-5472-4668-9856-3b7b45942c99-operator-scripts\") pod \"root-account-create-update-5k88w\" (UID: \"ae136de8-5472-4668-9856-3b7b45942c99\") " pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.421903 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjqms\" (UniqueName: \"kubernetes.io/projected/ae136de8-5472-4668-9856-3b7b45942c99-kube-api-access-vjqms\") pod \"root-account-create-update-5k88w\" (UID: \"ae136de8-5472-4668-9856-3b7b45942c99\") " pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:02 crc kubenswrapper[4707]: I0129 03:46:02.545740 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:03 crc kubenswrapper[4707]: I0129 03:46:03.462950 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 03:46:03 crc kubenswrapper[4707]: I0129 03:46:03.465164 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 03:46:05 crc kubenswrapper[4707]: I0129 03:46:05.491802 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:46:05 crc kubenswrapper[4707]: I0129 03:46:05.575467 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pxd9d"]
Jan 29 03:46:05 crc kubenswrapper[4707]: I0129 03:46:05.575853 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-pxd9d" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="dnsmasq-dns" containerID="cri-o://f6fbfba7fedad77869794e5ddf408d7c9c846b223220661ce99cc66f0abf8153" gracePeriod=10
Jan 29 03:46:05 crc kubenswrapper[4707]: I0129 03:46:05.661263 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-pxd9d" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused"
Jan 29 03:46:05 crc kubenswrapper[4707]: I0129 03:46:05.944346 4707 generic.go:334] "Generic (PLEG): container finished"
podID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerID="f6fbfba7fedad77869794e5ddf408d7c9c846b223220661ce99cc66f0abf8153" exitCode=0 Jan 29 03:46:05 crc kubenswrapper[4707]: I0129 03:46:05.944406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pxd9d" event={"ID":"12874d9e-78e2-4430-a7e0-7f542dc518c0","Type":"ContainerDied","Data":"f6fbfba7fedad77869794e5ddf408d7c9c846b223220661ce99cc66f0abf8153"} Jan 29 03:46:10 crc kubenswrapper[4707]: I0129 03:46:10.660452 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-pxd9d" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Jan 29 03:46:12 crc kubenswrapper[4707]: E0129 03:46:12.841470 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Jan 29 03:46:12 crc kubenswrapper[4707]: E0129 03:46:12.842084 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67dh65bhf8h79h55fh9fhb8h65ch5c7h96hdh97h695h566h5f8h58fh577h5h567h666h5b7h577h96hcch5h595h64dhf7h88h56dh95h6dq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8p55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(fef3f370-7e00-4c47-be4a-23a8919e0c89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:46:15 crc kubenswrapper[4707]: E0129 03:46:15.004708 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Jan 29 03:46:15 crc kubenswrapper[4707]: E0129 03:46:15.005892 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fvznl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-9gbdt_openstack(c3fb185b-2bb2-4cc2-8572-b38db5027edb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:46:15 crc kubenswrapper[4707]: E0129 03:46:15.007826 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-9gbdt" podUID="c3fb185b-2bb2-4cc2-8572-b38db5027edb" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.079487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kh5gh" event={"ID":"08bfd852-d6e3-4e65-845b-f2489a6e0fb1","Type":"ContainerDied","Data":"e83ff0cf2103a1c0f5dc68fa3b4b185ca231867e8cb7ed822e3e0d469ffdce6e"} Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.079588 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e83ff0cf2103a1c0f5dc68fa3b4b185ca231867e8cb7ed822e3e0d469ffdce6e" Jan 29 03:46:15 crc kubenswrapper[4707]: E0129 03:46:15.089335 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-9gbdt" podUID="c3fb185b-2bb2-4cc2-8572-b38db5027edb" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.105751 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.193502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-config-data\") pod \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.194302 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-combined-ca-bundle\") pod \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.194366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-fernet-keys\") pod \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.194434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvkjj\" (UniqueName: \"kubernetes.io/projected/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-kube-api-access-gvkjj\") pod \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.194566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-scripts\") pod \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.194639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-credential-keys\") pod \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\" (UID: \"08bfd852-d6e3-4e65-845b-f2489a6e0fb1\") " Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.207597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-kube-api-access-gvkjj" (OuterVolumeSpecName: "kube-api-access-gvkjj") pod "08bfd852-d6e3-4e65-845b-f2489a6e0fb1" (UID: "08bfd852-d6e3-4e65-845b-f2489a6e0fb1"). InnerVolumeSpecName "kube-api-access-gvkjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.208672 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "08bfd852-d6e3-4e65-845b-f2489a6e0fb1" (UID: "08bfd852-d6e3-4e65-845b-f2489a6e0fb1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.212623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-scripts" (OuterVolumeSpecName: "scripts") pod "08bfd852-d6e3-4e65-845b-f2489a6e0fb1" (UID: "08bfd852-d6e3-4e65-845b-f2489a6e0fb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.222455 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "08bfd852-d6e3-4e65-845b-f2489a6e0fb1" (UID: "08bfd852-d6e3-4e65-845b-f2489a6e0fb1"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.261730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08bfd852-d6e3-4e65-845b-f2489a6e0fb1" (UID: "08bfd852-d6e3-4e65-845b-f2489a6e0fb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.262002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-config-data" (OuterVolumeSpecName: "config-data") pod "08bfd852-d6e3-4e65-845b-f2489a6e0fb1" (UID: "08bfd852-d6e3-4e65-845b-f2489a6e0fb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.299500 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.299529 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.299555 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvkjj\" (UniqueName: \"kubernetes.io/projected/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-kube-api-access-gvkjj\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.299570 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 
03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.299583 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.299596 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08bfd852-d6e3-4e65-845b-f2489a6e0fb1-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.661343 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-pxd9d" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Jan 29 03:46:15 crc kubenswrapper[4707]: I0129 03:46:15.661518 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.087120 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kh5gh" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.212734 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kh5gh"] Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.222886 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kh5gh"] Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.326410 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8tm9r"] Jan 29 03:46:16 crc kubenswrapper[4707]: E0129 03:46:16.326925 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08bfd852-d6e3-4e65-845b-f2489a6e0fb1" containerName="keystone-bootstrap" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.326942 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="08bfd852-d6e3-4e65-845b-f2489a6e0fb1" containerName="keystone-bootstrap" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.327131 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="08bfd852-d6e3-4e65-845b-f2489a6e0fb1" containerName="keystone-bootstrap" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.327827 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.330026 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.330727 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.330785 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdlvf" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.331967 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.339196 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8tm9r"] Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.341053 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.421849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-scripts\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.421982 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-combined-ca-bundle\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.422021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-credential-keys\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.422288 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr25g\" (UniqueName: \"kubernetes.io/projected/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-kube-api-access-jr25g\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.422668 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-fernet-keys\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.422784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-config-data\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.524972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-scripts\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.525665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-combined-ca-bundle\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.526294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-credential-keys\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.526370 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr25g\" (UniqueName: \"kubernetes.io/projected/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-kube-api-access-jr25g\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.526447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-fernet-keys\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.526501 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-config-data\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.529387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-scripts\") pod \"keystone-bootstrap-8tm9r\" (UID: 
\"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.530524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-credential-keys\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.530814 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-combined-ca-bundle\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.532435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-config-data\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.545225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-fernet-keys\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 03:46:16.545956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr25g\" (UniqueName: \"kubernetes.io/projected/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-kube-api-access-jr25g\") pod \"keystone-bootstrap-8tm9r\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:16 crc kubenswrapper[4707]: I0129 
03:46:16.646272 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:17 crc kubenswrapper[4707]: I0129 03:46:17.098180 4707 generic.go:334] "Generic (PLEG): container finished" podID="42399f3f-f8c5-45dc-b192-b3a8997c4636" containerID="d94c998a7d7df8bac0a22de419445fed83c8d7899f59d768dc82fc00977ce505" exitCode=0 Jan 29 03:46:17 crc kubenswrapper[4707]: I0129 03:46:17.098240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2hmmz" event={"ID":"42399f3f-f8c5-45dc-b192-b3a8997c4636","Type":"ContainerDied","Data":"d94c998a7d7df8bac0a22de419445fed83c8d7899f59d768dc82fc00977ce505"} Jan 29 03:46:17 crc kubenswrapper[4707]: I0129 03:46:17.257235 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08bfd852-d6e3-4e65-845b-f2489a6e0fb1" path="/var/lib/kubelet/pods/08bfd852-d6e3-4e65-845b-f2489a6e0fb1/volumes" Jan 29 03:46:20 crc kubenswrapper[4707]: I0129 03:46:20.660901 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-pxd9d" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Jan 29 03:46:23 crc kubenswrapper[4707]: E0129 03:46:23.794883 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Jan 29 03:46:23 crc kubenswrapper[4707]: E0129 03:46:23.795527 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8qrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-wbtsz_openstack(6808d614-6634-4b2a-9e78-7480a0921415): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 29 03:46:23 crc kubenswrapper[4707]: E0129 03:46:23.797949 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-wbtsz" podUID="6808d614-6634-4b2a-9e78-7480a0921415"
Jan 29 03:46:23 crc kubenswrapper[4707]: I0129 03:46:23.884980 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:23 crc kubenswrapper[4707]: I0129 03:46:23.894660 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.009402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.010183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-public-tls-certs\") pod \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.010255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-internal-tls-certs\") pod \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.010366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-combined-ca-bundle\") pod \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.010491 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-scripts\") pod \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.010645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-combined-ca-bundle\") pod \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.010725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-config-data\") pod \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.010831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-scripts\") pod \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.010908 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-logs\") pod \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.010969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tljgv\" (UniqueName: \"kubernetes.io/projected/2ab1334d-c75f-418c-b493-e8ce0ce95bef-kube-api-access-tljgv\") pod \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.011124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.011218 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-logs\") pod \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.011302 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-httpd-run\") pod \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.011401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-httpd-run\") pod \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.011504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-config-data\") pod \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\" (UID: \"2ab1334d-c75f-418c-b493-e8ce0ce95bef\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.011649 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9q5v\" (UniqueName: \"kubernetes.io/projected/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-kube-api-access-n9q5v\") pod \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\" (UID: \"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f\") "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.011756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-logs" (OuterVolumeSpecName: "logs") pod "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" (UID: "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.011900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-logs" (OuterVolumeSpecName: "logs") pod "2ab1334d-c75f-418c-b493-e8ce0ce95bef" (UID: "2ab1334d-c75f-418c-b493-e8ce0ce95bef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.012050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" (UID: "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.012285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2ab1334d-c75f-418c-b493-e8ce0ce95bef" (UID: "2ab1334d-c75f-418c-b493-e8ce0ce95bef"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.012868 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-logs\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.012932 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-logs\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.012960 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ab1334d-c75f-418c-b493-e8ce0ce95bef-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.012990 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.017361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "2ab1334d-c75f-418c-b493-e8ce0ce95bef" (UID: "2ab1334d-c75f-418c-b493-e8ce0ce95bef"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.018136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab1334d-c75f-418c-b493-e8ce0ce95bef-kube-api-access-tljgv" (OuterVolumeSpecName: "kube-api-access-tljgv") pod "2ab1334d-c75f-418c-b493-e8ce0ce95bef" (UID: "2ab1334d-c75f-418c-b493-e8ce0ce95bef"). InnerVolumeSpecName "kube-api-access-tljgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.018665 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-kube-api-access-n9q5v" (OuterVolumeSpecName: "kube-api-access-n9q5v") pod "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" (UID: "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f"). InnerVolumeSpecName "kube-api-access-n9q5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.018961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" (UID: "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.020226 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-scripts" (OuterVolumeSpecName: "scripts") pod "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" (UID: "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.022237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-scripts" (OuterVolumeSpecName: "scripts") pod "2ab1334d-c75f-418c-b493-e8ce0ce95bef" (UID: "2ab1334d-c75f-418c-b493-e8ce0ce95bef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.056322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ab1334d-c75f-418c-b493-e8ce0ce95bef" (UID: "2ab1334d-c75f-418c-b493-e8ce0ce95bef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.074683 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2ab1334d-c75f-418c-b493-e8ce0ce95bef" (UID: "2ab1334d-c75f-418c-b493-e8ce0ce95bef"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.075352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-config-data" (OuterVolumeSpecName: "config-data") pod "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" (UID: "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.075843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" (UID: "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.077686 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" (UID: "bd2eb1c9-ebb6-43c5-b32a-768769d63d0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.078743 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-config-data" (OuterVolumeSpecName: "config-data") pod "2ab1334d-c75f-418c-b493-e8ce0ce95bef" (UID: "2ab1334d-c75f-418c-b493-e8ce0ce95bef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114464 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tljgv\" (UniqueName: \"kubernetes.io/projected/2ab1334d-c75f-418c-b493-e8ce0ce95bef-kube-api-access-tljgv\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114529 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114553 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114567 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9q5v\" (UniqueName: \"kubernetes.io/projected/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-kube-api-access-n9q5v\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114584 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114593 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114602 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114611 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114621 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114630 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114638 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.114648 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab1334d-c75f-418c-b493-e8ce0ce95bef-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.132533 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.132956 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.164155 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.164183 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ab1334d-c75f-418c-b493-e8ce0ce95bef","Type":"ContainerDied","Data":"7e8a11d7b5de8e2bc34d7bf6e95195ed9db206704a0f2435fa50b40ccb4cfab9"}
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.164313 4707 scope.go:117] "RemoveContainer" containerID="3577cc90e3775ceb307b578a0146c6b6202bbb36ff7f955e561a9b90d5052b68"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.169158 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.169341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd2eb1c9-ebb6-43c5-b32a-768769d63d0f","Type":"ContainerDied","Data":"e5f44b579e3776e22cef6bf5d22252567ee941572c9ced64a79ea5a6435bbbf9"}
Jan 29 03:46:24 crc kubenswrapper[4707]: E0129 03:46:24.171206 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-wbtsz" podUID="6808d614-6634-4b2a-9e78-7480a0921415"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.216500 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.216560 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.224041 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.241670 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.255691 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.284614 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.301654 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 03:46:24 crc kubenswrapper[4707]: E0129 03:46:24.302190 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerName="glance-httpd"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.302211 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerName="glance-httpd"
Jan 29 03:46:24 crc kubenswrapper[4707]: E0129 03:46:24.302243 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerName="glance-log"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.302251 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerName="glance-log"
Jan 29 03:46:24 crc kubenswrapper[4707]: E0129 03:46:24.302267 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerName="glance-log"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.302275 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerName="glance-log"
Jan 29 03:46:24 crc kubenswrapper[4707]: E0129 03:46:24.302288 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerName="glance-httpd"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.302295 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerName="glance-httpd"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.302452 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerName="glance-httpd"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.302468 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" containerName="glance-log"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.302478 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerName="glance-httpd"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.302489 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" containerName="glance-log"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.303529 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.305950 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tqq24"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.306185 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.306288 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.306376 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.319297 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.333459 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.335724 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.337778 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.339597 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.342293 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.422928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.423010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.423049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.423083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-logs\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.423121 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mtm\" (UniqueName: \"kubernetes.io/projected/f0871cd7-6629-480c-801d-73c00a747882-kube-api-access-59mtm\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.423153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.423233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.423267 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.524791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.524845 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.524877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.524910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-logs\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.524966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59mtm\" (UniqueName: \"kubernetes.io/projected/f0871cd7-6629-480c-801d-73c00a747882-kube-api-access-59mtm\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.525124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.525390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.525484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.525515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-logs\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.525569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.525636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.525693 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-logs\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.525776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr5td\" (UniqueName: \"kubernetes.io/projected/434a08d3-ec01-45a9-9b61-ceb740c82fa0-kube-api-access-tr5td\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.525857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.526045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.526199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.526310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.526334 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.526756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.542790 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.544158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.550305 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mtm\" (UniqueName: \"kubernetes.io/projected/f0871cd7-6629-480c-801d-73c00a747882-kube-api-access-59mtm\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.558433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.561206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.563286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " pod="openstack/glance-default-external-api-0"
Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.627800 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") "
pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.627858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.627882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-logs\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.627944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr5td\" (UniqueName: \"kubernetes.io/projected/434a08d3-ec01-45a9-9b61-ceb740c82fa0-kube-api-access-tr5td\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.629070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-logs\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.629146 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc 
kubenswrapper[4707]: I0129 03:46:24.629416 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.629486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.629827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.629867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.630086 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.634553 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.634979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.634978 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.635131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.638024 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.647843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr5td\" (UniqueName: \"kubernetes.io/projected/434a08d3-ec01-45a9-9b61-ceb740c82fa0-kube-api-access-tr5td\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.666319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:46:24 crc kubenswrapper[4707]: I0129 03:46:24.701054 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.258681 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab1334d-c75f-418c-b493-e8ce0ce95bef" path="/var/lib/kubelet/pods/2ab1334d-c75f-418c-b493-e8ce0ce95bef/volumes" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.259463 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2eb1c9-ebb6-43c5-b32a-768769d63d0f" path="/var/lib/kubelet/pods/bd2eb1c9-ebb6-43c5-b32a-768769d63d0f/volumes" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.266526 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.280040 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-2hmmz" Jan 29 03:46:25 crc kubenswrapper[4707]: E0129 03:46:25.318440 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 29 03:46:25 crc kubenswrapper[4707]: E0129 03:46:25.318615 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-c
a-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kx4fg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-gzpxq_openstack(e6b80fc8-a8ca-417d-9f86-d4fb86587f3a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 03:46:25 crc kubenswrapper[4707]: E0129 03:46:25.320182 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-gzpxq" podUID="e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.341650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-nb\") pod \"12874d9e-78e2-4430-a7e0-7f542dc518c0\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.341804 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-config\") pod 
\"12874d9e-78e2-4430-a7e0-7f542dc518c0\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.341835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-sb\") pod \"12874d9e-78e2-4430-a7e0-7f542dc518c0\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.341932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-dns-svc\") pod \"12874d9e-78e2-4430-a7e0-7f542dc518c0\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.342010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bhlv\" (UniqueName: \"kubernetes.io/projected/12874d9e-78e2-4430-a7e0-7f542dc518c0-kube-api-access-4bhlv\") pod \"12874d9e-78e2-4430-a7e0-7f542dc518c0\" (UID: \"12874d9e-78e2-4430-a7e0-7f542dc518c0\") " Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.348804 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12874d9e-78e2-4430-a7e0-7f542dc518c0-kube-api-access-4bhlv" (OuterVolumeSpecName: "kube-api-access-4bhlv") pod "12874d9e-78e2-4430-a7e0-7f542dc518c0" (UID: "12874d9e-78e2-4430-a7e0-7f542dc518c0"). InnerVolumeSpecName "kube-api-access-4bhlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.391863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "12874d9e-78e2-4430-a7e0-7f542dc518c0" (UID: "12874d9e-78e2-4430-a7e0-7f542dc518c0"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.399779 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-config" (OuterVolumeSpecName: "config") pod "12874d9e-78e2-4430-a7e0-7f542dc518c0" (UID: "12874d9e-78e2-4430-a7e0-7f542dc518c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.414918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "12874d9e-78e2-4430-a7e0-7f542dc518c0" (UID: "12874d9e-78e2-4430-a7e0-7f542dc518c0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.417191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12874d9e-78e2-4430-a7e0-7f542dc518c0" (UID: "12874d9e-78e2-4430-a7e0-7f542dc518c0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.447423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfmrc\" (UniqueName: \"kubernetes.io/projected/42399f3f-f8c5-45dc-b192-b3a8997c4636-kube-api-access-kfmrc\") pod \"42399f3f-f8c5-45dc-b192-b3a8997c4636\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.448045 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-combined-ca-bundle\") pod \"42399f3f-f8c5-45dc-b192-b3a8997c4636\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.448087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-config\") pod \"42399f3f-f8c5-45dc-b192-b3a8997c4636\" (UID: \"42399f3f-f8c5-45dc-b192-b3a8997c4636\") " Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.448549 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.448566 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.448576 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.448585 4707 reconciler_common.go:293] 
"Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12874d9e-78e2-4430-a7e0-7f542dc518c0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.448594 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bhlv\" (UniqueName: \"kubernetes.io/projected/12874d9e-78e2-4430-a7e0-7f542dc518c0-kube-api-access-4bhlv\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.451306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42399f3f-f8c5-45dc-b192-b3a8997c4636-kube-api-access-kfmrc" (OuterVolumeSpecName: "kube-api-access-kfmrc") pod "42399f3f-f8c5-45dc-b192-b3a8997c4636" (UID: "42399f3f-f8c5-45dc-b192-b3a8997c4636"). InnerVolumeSpecName "kube-api-access-kfmrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.470420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42399f3f-f8c5-45dc-b192-b3a8997c4636" (UID: "42399f3f-f8c5-45dc-b192-b3a8997c4636"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.471195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-config" (OuterVolumeSpecName: "config") pod "42399f3f-f8c5-45dc-b192-b3a8997c4636" (UID: "42399f3f-f8c5-45dc-b192-b3a8997c4636"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.551490 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfmrc\" (UniqueName: \"kubernetes.io/projected/42399f3f-f8c5-45dc-b192-b3a8997c4636-kube-api-access-kfmrc\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.551553 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.551567 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/42399f3f-f8c5-45dc-b192-b3a8997c4636-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.787823 4707 scope.go:117] "RemoveContainer" containerID="e368866ea1aaa74994e984f56302ad79985763d5e596715969d33d8ec220ac39" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.830743 4707 scope.go:117] "RemoveContainer" containerID="a01554e2fade367a3e2f15769d535bf0bb8c678547729bc17c31b2e4154f2040" Jan 29 03:46:25 crc kubenswrapper[4707]: I0129 03:46:25.855408 4707 scope.go:117] "RemoveContainer" containerID="a30b5d17914454e8574a8b8a6ea87619dc8c9583b112099aaf634d5f745e69fd" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.204441 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pxd9d" event={"ID":"12874d9e-78e2-4430-a7e0-7f542dc518c0","Type":"ContainerDied","Data":"464a80d47fe8cf6d4c11a0eaf013c85f8604c3f0bad8db13705ece09a6669fc0"} Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.204513 4707 scope.go:117] "RemoveContainer" containerID="f6fbfba7fedad77869794e5ddf408d7c9c846b223220661ce99cc66f0abf8153" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.204654 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pxd9d" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.218725 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gjzwn" event={"ID":"3a2cf721-9a8b-49ab-9e57-1337f407db4f","Type":"ContainerStarted","Data":"9e15fd15b9d508d8e4354452e5ca76ed5444a58e8db9397d7377afccee8c451f"} Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.221631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fef3f370-7e00-4c47-be4a-23a8919e0c89","Type":"ContainerStarted","Data":"6b15895a18792bdda14c470f84a8e95e01aa65e1e0c4f766264206f64e735a5b"} Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.226439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2hmmz" event={"ID":"42399f3f-f8c5-45dc-b192-b3a8997c4636","Type":"ContainerDied","Data":"1d06d2195fbb66f192512969a3a52b58654706a6c2cd889f661d72264a0984e7"} Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.226524 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d06d2195fbb66f192512969a3a52b58654706a6c2cd889f661d72264a0984e7" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.226469 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-2hmmz" Jan 29 03:46:26 crc kubenswrapper[4707]: E0129 03:46:26.235138 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-gzpxq" podUID="e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.247039 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pxd9d"] Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.251348 4707 scope.go:117] "RemoveContainer" containerID="6ca1552f4071c6d249ac4cefc6015a26e36b74ca4ea609f9f11eb9e76eafb1f3" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.261204 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pxd9d"] Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.275317 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5k88w"] Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.293012 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gjzwn" podStartSLOduration=4.009614566 podStartE2EDuration="32.292982648s" podCreationTimestamp="2026-01-29 03:45:54 +0000 UTC" firstStartedPulling="2026-01-29 03:45:56.994698354 +0000 UTC m=+1110.478927259" lastFinishedPulling="2026-01-29 03:46:25.278066436 +0000 UTC m=+1138.762295341" observedRunningTime="2026-01-29 03:46:26.257174428 +0000 UTC m=+1139.741403343" watchObservedRunningTime="2026-01-29 03:46:26.292982648 +0000 UTC m=+1139.777211553" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.364705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8tm9r"] Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.508078 4707 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.608780 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ggj2r"] Jan 29 03:46:26 crc kubenswrapper[4707]: E0129 03:46:26.615081 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="dnsmasq-dns" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.615121 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="dnsmasq-dns" Jan 29 03:46:26 crc kubenswrapper[4707]: E0129 03:46:26.615144 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="init" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.615150 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="init" Jan 29 03:46:26 crc kubenswrapper[4707]: E0129 03:46:26.615188 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42399f3f-f8c5-45dc-b192-b3a8997c4636" containerName="neutron-db-sync" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.615195 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42399f3f-f8c5-45dc-b192-b3a8997c4636" containerName="neutron-db-sync" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.615478 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" containerName="dnsmasq-dns" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.615495 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="42399f3f-f8c5-45dc-b192-b3a8997c4636" containerName="neutron-db-sync" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.627477 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.631553 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f44bf9c6d-746dh"] Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.633454 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f44bf9c6d-746dh" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.639230 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.639431 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.640861 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8vm9d" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.647412 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.647605 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ggj2r"] Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.676200 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f44bf9c6d-746dh"] Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.795724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmbc6\" (UniqueName: \"kubernetes.io/projected/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-kube-api-access-pmbc6\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh" Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.795786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-httpd-config\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.795841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-combined-ca-bundle\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.795870 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.795886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.795910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-config\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.795952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.795986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-ovndb-tls-certs\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.796002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.796021 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-config\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.796055 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44c98\" (UniqueName: \"kubernetes.io/projected/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-kube-api-access-44c98\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.905086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.905157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.905209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-config\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.905461 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.905508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-ovndb-tls-certs\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.905550 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.907054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-config\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.907128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44c98\" (UniqueName: \"kubernetes.io/projected/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-kube-api-access-44c98\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.907187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmbc6\" (UniqueName: \"kubernetes.io/projected/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-kube-api-access-pmbc6\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.907207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-httpd-config\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.907214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.907253 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-combined-ca-bundle\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.908046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-config\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.908569 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.908854 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.909485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.911905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-combined-ca-bundle\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.915604 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-config\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.917270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-httpd-config\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.918261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-ovndb-tls-certs\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.930862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmbc6\" (UniqueName: \"kubernetes.io/projected/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-kube-api-access-pmbc6\") pod \"neutron-7f44bf9c6d-746dh\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") " pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.941904 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44c98\" (UniqueName: \"kubernetes.io/projected/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-kube-api-access-44c98\") pod \"dnsmasq-dns-55f844cf75-ggj2r\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.972501 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:26 crc kubenswrapper[4707]: I0129 03:46:26.985545 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.171650 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 03:46:27 crc kubenswrapper[4707]: W0129 03:46:27.205704 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0871cd7_6629_480c_801d_73c00a747882.slice/crio-246e10e7e0bacde856b7172c25d7915349b3c096ec8442acf8ed9f5af8ab4c5a WatchSource:0}: Error finding container 246e10e7e0bacde856b7172c25d7915349b3c096ec8442acf8ed9f5af8ab4c5a: Status 404 returned error can't find the container with id 246e10e7e0bacde856b7172c25d7915349b3c096ec8442acf8ed9f5af8ab4c5a
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.276236 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12874d9e-78e2-4430-a7e0-7f542dc518c0" path="/var/lib/kubelet/pods/12874d9e-78e2-4430-a7e0-7f542dc518c0/volumes"
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.279642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0871cd7-6629-480c-801d-73c00a747882","Type":"ContainerStarted","Data":"246e10e7e0bacde856b7172c25d7915349b3c096ec8442acf8ed9f5af8ab4c5a"}
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.294366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9gbdt" event={"ID":"c3fb185b-2bb2-4cc2-8572-b38db5027edb","Type":"ContainerStarted","Data":"096d748d230f8d85731c5abf7119a97caba232a8e5fa7ab9ecddcad3da9da62c"}
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.297965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"434a08d3-ec01-45a9-9b61-ceb740c82fa0","Type":"ContainerStarted","Data":"98641fc038f4d11ecd6840619726c5c723108d99ef8c4b1b7b896f1caf350b50"}
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.299678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8tm9r" event={"ID":"f326dfe8-582f-44e7-9030-8bbfbf4ccb68","Type":"ContainerStarted","Data":"d1002857d91bbb4e78112591a44e36f2bef70dfc80b0978c0dd542ac0bbf2e1e"}
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.299769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8tm9r" event={"ID":"f326dfe8-582f-44e7-9030-8bbfbf4ccb68","Type":"ContainerStarted","Data":"a07d3bdf9f2df62e8ec4fb4966475261904bcce9f8716ecd0bcb9974b8944069"}
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.306995 4707 generic.go:334] "Generic (PLEG): container finished" podID="ae136de8-5472-4668-9856-3b7b45942c99" containerID="03a9c90b6df5078686ed567815aaaa0464859b7022a274fbf4a95ee49428e46f" exitCode=0
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.308045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5k88w" event={"ID":"ae136de8-5472-4668-9856-3b7b45942c99","Type":"ContainerDied","Data":"03a9c90b6df5078686ed567815aaaa0464859b7022a274fbf4a95ee49428e46f"}
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.308083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5k88w" event={"ID":"ae136de8-5472-4668-9856-3b7b45942c99","Type":"ContainerStarted","Data":"c3e82ee5b4e79848b077ca23507d5e274dbf398acc7c3bb62b0b6ba607e8b83c"}
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.445990 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-9gbdt" podStartSLOduration=3.488916538 podStartE2EDuration="33.445959326s" podCreationTimestamp="2026-01-29 03:45:54 +0000 UTC" firstStartedPulling="2026-01-29 03:45:56.850479964 +0000 UTC m=+1110.334708859" lastFinishedPulling="2026-01-29 03:46:26.807522742 +0000 UTC m=+1140.291751647" observedRunningTime="2026-01-29 03:46:27.394026796 +0000 UTC m=+1140.878255701" watchObservedRunningTime="2026-01-29 03:46:27.445959326 +0000 UTC m=+1140.930188231"
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.467869 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8tm9r" podStartSLOduration=11.467839569 podStartE2EDuration="11.467839569s" podCreationTimestamp="2026-01-29 03:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:27.435267891 +0000 UTC m=+1140.919496796" watchObservedRunningTime="2026-01-29 03:46:27.467839569 +0000 UTC m=+1140.952068494"
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.747744 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ggj2r"]
Jan 29 03:46:27 crc kubenswrapper[4707]: W0129 03:46:27.752756 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9d44e5e_7bf6_4478_91bf_bd5c9b161b6e.slice/crio-43c1d2084a044fdf80c1ac1661cde18c885c245e1bd659e48a296f4def2ee420 WatchSource:0}: Error finding container 43c1d2084a044fdf80c1ac1661cde18c885c245e1bd659e48a296f4def2ee420: Status 404 returned error can't find the container with id 43c1d2084a044fdf80c1ac1661cde18c885c245e1bd659e48a296f4def2ee420
Jan 29 03:46:27 crc kubenswrapper[4707]: I0129 03:46:27.883167 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f44bf9c6d-746dh"]
Jan 29 03:46:28 crc kubenswrapper[4707]: I0129 03:46:28.385329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f44bf9c6d-746dh" event={"ID":"e70752bb-f7b2-4cd4-ace7-b64b837a8e95","Type":"ContainerStarted","Data":"7d581f904e5abc065a51a67dc3a980c6ed1d27c9ce7b386c07500efa89132130"}
Jan 29 03:46:28 crc kubenswrapper[4707]: I0129 03:46:28.397930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"434a08d3-ec01-45a9-9b61-ceb740c82fa0","Type":"ContainerStarted","Data":"55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d"}
Jan 29 03:46:28 crc kubenswrapper[4707]: I0129 03:46:28.416456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0871cd7-6629-480c-801d-73c00a747882","Type":"ContainerStarted","Data":"9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b"}
Jan 29 03:46:28 crc kubenswrapper[4707]: I0129 03:46:28.427317 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" containerID="88b999b2ef9b8b4ec97ebb9acbaf51674a8787c3470c3d43d40ef6323612fe81" exitCode=0
Jan 29 03:46:28 crc kubenswrapper[4707]: I0129 03:46:28.427766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" event={"ID":"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e","Type":"ContainerDied","Data":"88b999b2ef9b8b4ec97ebb9acbaf51674a8787c3470c3d43d40ef6323612fe81"}
Jan 29 03:46:28 crc kubenswrapper[4707]: I0129 03:46:28.427810 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" event={"ID":"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e","Type":"ContainerStarted","Data":"43c1d2084a044fdf80c1ac1661cde18c885c245e1bd659e48a296f4def2ee420"}
Jan 29 03:46:28 crc kubenswrapper[4707]: I0129 03:46:28.828818 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.017033 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae136de8-5472-4668-9856-3b7b45942c99-operator-scripts\") pod \"ae136de8-5472-4668-9856-3b7b45942c99\" (UID: \"ae136de8-5472-4668-9856-3b7b45942c99\") "
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.017863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae136de8-5472-4668-9856-3b7b45942c99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae136de8-5472-4668-9856-3b7b45942c99" (UID: "ae136de8-5472-4668-9856-3b7b45942c99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.018412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjqms\" (UniqueName: \"kubernetes.io/projected/ae136de8-5472-4668-9856-3b7b45942c99-kube-api-access-vjqms\") pod \"ae136de8-5472-4668-9856-3b7b45942c99\" (UID: \"ae136de8-5472-4668-9856-3b7b45942c99\") "
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.019359 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae136de8-5472-4668-9856-3b7b45942c99-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.025598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae136de8-5472-4668-9856-3b7b45942c99-kube-api-access-vjqms" (OuterVolumeSpecName: "kube-api-access-vjqms") pod "ae136de8-5472-4668-9856-3b7b45942c99" (UID: "ae136de8-5472-4668-9856-3b7b45942c99"). InnerVolumeSpecName "kube-api-access-vjqms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.124903 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjqms\" (UniqueName: \"kubernetes.io/projected/ae136de8-5472-4668-9856-3b7b45942c99-kube-api-access-vjqms\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.230299 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6694d87b67-rdpz4"]
Jan 29 03:46:29 crc kubenswrapper[4707]: E0129 03:46:29.231120 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae136de8-5472-4668-9856-3b7b45942c99" containerName="mariadb-account-create-update"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.231143 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae136de8-5472-4668-9856-3b7b45942c99" containerName="mariadb-account-create-update"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.231352 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae136de8-5472-4668-9856-3b7b45942c99" containerName="mariadb-account-create-update"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.232598 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.248225 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.248501 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.264013 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6694d87b67-rdpz4"]
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.330962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-internal-tls-certs\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.331052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsjlz\" (UniqueName: \"kubernetes.io/projected/2500de98-6ed6-4399-889c-a397807fcd52-kube-api-access-qsjlz\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.331083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-public-tls-certs\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.331137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-ovndb-tls-certs\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.331222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-config\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.331523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-httpd-config\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.331650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-combined-ca-bundle\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.433419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-httpd-config\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.433562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-combined-ca-bundle\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.433609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-internal-tls-certs\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.433643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsjlz\" (UniqueName: \"kubernetes.io/projected/2500de98-6ed6-4399-889c-a397807fcd52-kube-api-access-qsjlz\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.433672 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-public-tls-certs\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.433714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-ovndb-tls-certs\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.433754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-config\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.448110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-combined-ca-bundle\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.450947 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-internal-tls-certs\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.451800 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-config\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.453754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f44bf9c6d-746dh" event={"ID":"e70752bb-f7b2-4cd4-ace7-b64b837a8e95","Type":"ContainerStarted","Data":"9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd"}
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.453826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f44bf9c6d-746dh" event={"ID":"e70752bb-f7b2-4cd4-ace7-b64b837a8e95","Type":"ContainerStarted","Data":"479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f"}
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.455342 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.459788 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-httpd-config\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.460773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" event={"ID":"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e","Type":"ContainerStarted","Data":"3fd4b150cc3cce0225f9c529689fb7979f1c4d6d86140fee252a91e3a3dc83d1"}
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.461837 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.463369 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-ovndb-tls-certs\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.464454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"434a08d3-ec01-45a9-9b61-ceb740c82fa0","Type":"ContainerStarted","Data":"ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e"}
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.466984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5k88w" event={"ID":"ae136de8-5472-4668-9856-3b7b45942c99","Type":"ContainerDied","Data":"c3e82ee5b4e79848b077ca23507d5e274dbf398acc7c3bb62b0b6ba607e8b83c"}
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.467011 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3e82ee5b4e79848b077ca23507d5e274dbf398acc7c3bb62b0b6ba607e8b83c"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.467059 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5k88w"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.469586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0871cd7-6629-480c-801d-73c00a747882","Type":"ContainerStarted","Data":"ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48"}
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.475823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-public-tls-certs\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.476113 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsjlz\" (UniqueName: \"kubernetes.io/projected/2500de98-6ed6-4399-889c-a397807fcd52-kube-api-access-qsjlz\") pod \"neutron-6694d87b67-rdpz4\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.513207 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f44bf9c6d-746dh" podStartSLOduration=3.513179675 podStartE2EDuration="3.513179675s" podCreationTimestamp="2026-01-29 03:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:29.489512611 +0000 UTC m=+1142.973741516" watchObservedRunningTime="2026-01-29 03:46:29.513179675 +0000 UTC m=+1142.997408580"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.543564 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.54352246 podStartE2EDuration="5.54352246s" podCreationTimestamp="2026-01-29 03:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:29.530695045 +0000 UTC m=+1143.014923950" watchObservedRunningTime="2026-01-29 03:46:29.54352246 +0000 UTC m=+1143.027751365"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.589894 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6694d87b67-rdpz4"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.627875 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.627851983 podStartE2EDuration="5.627851983s" podCreationTimestamp="2026-01-29 03:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:29.570936951 +0000 UTC m=+1143.055165856" watchObservedRunningTime="2026-01-29 03:46:29.627851983 +0000 UTC m=+1143.112080888"
Jan 29 03:46:29 crc kubenswrapper[4707]: I0129 03:46:29.641995 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" podStartSLOduration=3.641972786 podStartE2EDuration="3.641972786s" podCreationTimestamp="2026-01-29 03:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:29.631003263 +0000 UTC m=+1143.115232168" watchObservedRunningTime="2026-01-29 03:46:29.641972786 +0000 UTC m=+1143.126201691"
Jan 29 03:46:30 crc kubenswrapper[4707]: I0129 03:46:30.365035 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6694d87b67-rdpz4"]
Jan 29 03:46:30 crc kubenswrapper[4707]: I0129 03:46:30.500308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6694d87b67-rdpz4" event={"ID":"2500de98-6ed6-4399-889c-a397807fcd52","Type":"ContainerStarted","Data":"fd20d2208d0a082f1870b295201ad1c8cd74894739b8430f27de978823c459b9"}
Jan 29 03:46:31 crc kubenswrapper[4707]: I0129 03:46:31.511217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6694d87b67-rdpz4" event={"ID":"2500de98-6ed6-4399-889c-a397807fcd52","Type":"ContainerStarted","Data":"eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389"}
Jan 29 03:46:31 crc kubenswrapper[4707]: I0129 03:46:31.513448 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3fb185b-2bb2-4cc2-8572-b38db5027edb" containerID="096d748d230f8d85731c5abf7119a97caba232a8e5fa7ab9ecddcad3da9da62c" exitCode=0
Jan 29 03:46:31 crc kubenswrapper[4707]: I0129 03:46:31.513574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9gbdt" event={"ID":"c3fb185b-2bb2-4cc2-8572-b38db5027edb","Type":"ContainerDied","Data":"096d748d230f8d85731c5abf7119a97caba232a8e5fa7ab9ecddcad3da9da62c"}
Jan 29 03:46:31 crc kubenswrapper[4707]: I0129 03:46:31.515707 4707 generic.go:334] "Generic (PLEG): container finished" podID="3a2cf721-9a8b-49ab-9e57-1337f407db4f" containerID="9e15fd15b9d508d8e4354452e5ca76ed5444a58e8db9397d7377afccee8c451f" exitCode=0
Jan 29 03:46:31 crc kubenswrapper[4707]: I0129 03:46:31.515856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gjzwn" event={"ID":"3a2cf721-9a8b-49ab-9e57-1337f407db4f","Type":"ContainerDied","Data":"9e15fd15b9d508d8e4354452e5ca76ed5444a58e8db9397d7377afccee8c451f"}
Jan 29 03:46:31 crc kubenswrapper[4707]: I0129 03:46:31.519065 4707 generic.go:334] "Generic (PLEG): container finished" podID="f326dfe8-582f-44e7-9030-8bbfbf4ccb68"
containerID="d1002857d91bbb4e78112591a44e36f2bef70dfc80b0978c0dd542ac0bbf2e1e" exitCode=0 Jan 29 03:46:31 crc kubenswrapper[4707]: I0129 03:46:31.519123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8tm9r" event={"ID":"f326dfe8-582f-44e7-9030-8bbfbf4ccb68","Type":"ContainerDied","Data":"d1002857d91bbb4e78112591a44e36f2bef70dfc80b0978c0dd542ac0bbf2e1e"} Jan 29 03:46:33 crc kubenswrapper[4707]: I0129 03:46:33.463527 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:46:33 crc kubenswrapper[4707]: I0129 03:46:33.464008 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:46:33 crc kubenswrapper[4707]: I0129 03:46:33.464073 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:46:33 crc kubenswrapper[4707]: I0129 03:46:33.464734 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9348d06267b549d79524d7d6fb99695969175eb246c0104709c649f6ca1b571"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 03:46:33 crc kubenswrapper[4707]: I0129 03:46:33.464792 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" 
podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://b9348d06267b549d79524d7d6fb99695969175eb246c0104709c649f6ca1b571" gracePeriod=600 Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.566964 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="b9348d06267b549d79524d7d6fb99695969175eb246c0104709c649f6ca1b571" exitCode=0 Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.567115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"b9348d06267b549d79524d7d6fb99695969175eb246c0104709c649f6ca1b571"} Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.567179 4707 scope.go:117] "RemoveContainer" containerID="db143d5776abf0f9e4af062dd3ffe22d1bdacd65eb8ea86d4728ea8e0ca0f327" Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.638860 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.638949 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.680987 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.702820 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.702909 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.708822 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-external-api-0" Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.757841 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:34 crc kubenswrapper[4707]: I0129 03:46:34.763603 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.365465 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.367009 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gjzwn" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.368479 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9gbdt" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.523734 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-scripts\") pod \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-combined-ca-bundle\") pod \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9r9b\" (UniqueName: \"kubernetes.io/projected/3a2cf721-9a8b-49ab-9e57-1337f407db4f-kube-api-access-m9r9b\") pod \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\" 
(UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-credential-keys\") pod \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-fernet-keys\") pod \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524317 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fb185b-2bb2-4cc2-8572-b38db5027edb-logs\") pod \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524464 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-config-data\") pod \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-combined-ca-bundle\") pod \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-config-data\") pod \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-scripts\") pod \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvznl\" (UniqueName: \"kubernetes.io/projected/c3fb185b-2bb2-4cc2-8572-b38db5027edb-kube-api-access-fvznl\") pod \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\" (UID: \"c3fb185b-2bb2-4cc2-8572-b38db5027edb\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524811 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-combined-ca-bundle\") pod \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr25g\" (UniqueName: \"kubernetes.io/projected/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-kube-api-access-jr25g\") pod \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\" (UID: \"f326dfe8-582f-44e7-9030-8bbfbf4ccb68\") " Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.524979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-db-sync-config-data\") pod \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\" (UID: \"3a2cf721-9a8b-49ab-9e57-1337f407db4f\") " Jan 29 03:46:35 crc 
kubenswrapper[4707]: I0129 03:46:35.526305 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3fb185b-2bb2-4cc2-8572-b38db5027edb-logs" (OuterVolumeSpecName: "logs") pod "c3fb185b-2bb2-4cc2-8572-b38db5027edb" (UID: "c3fb185b-2bb2-4cc2-8572-b38db5027edb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.529711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-scripts" (OuterVolumeSpecName: "scripts") pod "f326dfe8-582f-44e7-9030-8bbfbf4ccb68" (UID: "f326dfe8-582f-44e7-9030-8bbfbf4ccb68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.539734 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-kube-api-access-jr25g" (OuterVolumeSpecName: "kube-api-access-jr25g") pod "f326dfe8-582f-44e7-9030-8bbfbf4ccb68" (UID: "f326dfe8-582f-44e7-9030-8bbfbf4ccb68"). InnerVolumeSpecName "kube-api-access-jr25g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.540307 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f326dfe8-582f-44e7-9030-8bbfbf4ccb68" (UID: "f326dfe8-582f-44e7-9030-8bbfbf4ccb68"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.540620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3a2cf721-9a8b-49ab-9e57-1337f407db4f" (UID: "3a2cf721-9a8b-49ab-9e57-1337f407db4f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.540762 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f326dfe8-582f-44e7-9030-8bbfbf4ccb68" (UID: "f326dfe8-582f-44e7-9030-8bbfbf4ccb68"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.540877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3fb185b-2bb2-4cc2-8572-b38db5027edb-kube-api-access-fvznl" (OuterVolumeSpecName: "kube-api-access-fvznl") pod "c3fb185b-2bb2-4cc2-8572-b38db5027edb" (UID: "c3fb185b-2bb2-4cc2-8572-b38db5027edb"). InnerVolumeSpecName "kube-api-access-fvznl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.541177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2cf721-9a8b-49ab-9e57-1337f407db4f-kube-api-access-m9r9b" (OuterVolumeSpecName: "kube-api-access-m9r9b") pod "3a2cf721-9a8b-49ab-9e57-1337f407db4f" (UID: "3a2cf721-9a8b-49ab-9e57-1337f407db4f"). InnerVolumeSpecName "kube-api-access-m9r9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.543431 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-scripts" (OuterVolumeSpecName: "scripts") pod "c3fb185b-2bb2-4cc2-8572-b38db5027edb" (UID: "c3fb185b-2bb2-4cc2-8572-b38db5027edb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.557269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a2cf721-9a8b-49ab-9e57-1337f407db4f" (UID: "3a2cf721-9a8b-49ab-9e57-1337f407db4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.566333 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-config-data" (OuterVolumeSpecName: "config-data") pod "c3fb185b-2bb2-4cc2-8572-b38db5027edb" (UID: "c3fb185b-2bb2-4cc2-8572-b38db5027edb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.571464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f326dfe8-582f-44e7-9030-8bbfbf4ccb68" (UID: "f326dfe8-582f-44e7-9030-8bbfbf4ccb68"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.572883 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-config-data" (OuterVolumeSpecName: "config-data") pod "f326dfe8-582f-44e7-9030-8bbfbf4ccb68" (UID: "f326dfe8-582f-44e7-9030-8bbfbf4ccb68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.582337 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9gbdt" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.582633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9gbdt" event={"ID":"c3fb185b-2bb2-4cc2-8572-b38db5027edb","Type":"ContainerDied","Data":"582dd21d9e73b68736fa78e4bb97ee66ea66dd6a1fa27c3ad035d6f975397acf"} Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.582694 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="582dd21d9e73b68736fa78e4bb97ee66ea66dd6a1fa27c3ad035d6f975397acf" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.582820 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3fb185b-2bb2-4cc2-8572-b38db5027edb" (UID: "c3fb185b-2bb2-4cc2-8572-b38db5027edb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.585305 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gjzwn" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.585303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gjzwn" event={"ID":"3a2cf721-9a8b-49ab-9e57-1337f407db4f","Type":"ContainerDied","Data":"2caa257e8152804acc902d3ea3ca145fb6d03484778d9d9193b38869740a7af6"} Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.585485 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2caa257e8152804acc902d3ea3ca145fb6d03484778d9d9193b38869740a7af6" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.597896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8tm9r" event={"ID":"f326dfe8-582f-44e7-9030-8bbfbf4ccb68","Type":"ContainerDied","Data":"a07d3bdf9f2df62e8ec4fb4966475261904bcce9f8716ecd0bcb9974b8944069"} Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.597940 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a07d3bdf9f2df62e8ec4fb4966475261904bcce9f8716ecd0bcb9974b8944069" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.598009 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8tm9r" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.605677 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.605712 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.605725 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.605734 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626815 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr25g\" (UniqueName: \"kubernetes.io/projected/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-kube-api-access-jr25g\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626848 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626859 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626875 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626889 4707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-m9r9b\" (UniqueName: \"kubernetes.io/projected/3a2cf721-9a8b-49ab-9e57-1337f407db4f-kube-api-access-m9r9b\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626900 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626911 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626922 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3fb185b-2bb2-4cc2-8572-b38db5027edb-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626933 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f326dfe8-582f-44e7-9030-8bbfbf4ccb68-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626945 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626954 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626967 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3fb185b-2bb2-4cc2-8572-b38db5027edb-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc 
kubenswrapper[4707]: I0129 03:46:35.626980 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvznl\" (UniqueName: \"kubernetes.io/projected/c3fb185b-2bb2-4cc2-8572-b38db5027edb-kube-api-access-fvznl\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:35 crc kubenswrapper[4707]: I0129 03:46:35.626994 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2cf721-9a8b-49ab-9e57-1337f407db4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.585151 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6568fdcd45-j5nxz"] Jan 29 03:46:36 crc kubenswrapper[4707]: E0129 03:46:36.586385 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fb185b-2bb2-4cc2-8572-b38db5027edb" containerName="placement-db-sync" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.586401 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fb185b-2bb2-4cc2-8572-b38db5027edb" containerName="placement-db-sync" Jan 29 03:46:36 crc kubenswrapper[4707]: E0129 03:46:36.586420 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2cf721-9a8b-49ab-9e57-1337f407db4f" containerName="barbican-db-sync" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.586425 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2cf721-9a8b-49ab-9e57-1337f407db4f" containerName="barbican-db-sync" Jan 29 03:46:36 crc kubenswrapper[4707]: E0129 03:46:36.586447 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f326dfe8-582f-44e7-9030-8bbfbf4ccb68" containerName="keystone-bootstrap" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.586454 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f326dfe8-582f-44e7-9030-8bbfbf4ccb68" containerName="keystone-bootstrap" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.586667 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c3fb185b-2bb2-4cc2-8572-b38db5027edb" containerName="placement-db-sync" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.586687 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2cf721-9a8b-49ab-9e57-1337f407db4f" containerName="barbican-db-sync" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.586696 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f326dfe8-582f-44e7-9030-8bbfbf4ccb68" containerName="keystone-bootstrap" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.587386 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.590905 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.591716 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.591938 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.592072 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.592197 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdlvf" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.592363 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.610550 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6568fdcd45-j5nxz"] Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.632276 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-54fd6b997b-lbt29"] Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.658312 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-config-data\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.658404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-combined-ca-bundle\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.658430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-scripts\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.658494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-fernet-keys\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.658898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-public-tls-certs\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " 
pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.658978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qnnf\" (UniqueName: \"kubernetes.io/projected/0542ad30-5c42-4464-83b9-3faebd15a9ea-kube-api-access-2qnnf\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.659032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-internal-tls-certs\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.659116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-credential-keys\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.668267 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6694d87b67-rdpz4" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.668474 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.668547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6694d87b67-rdpz4" event={"ID":"2500de98-6ed6-4399-889c-a397807fcd52","Type":"ContainerStarted","Data":"5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532"} Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.668613 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54fd6b997b-lbt29"] Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.669883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"cf0424c462083ab78d6f7ecb618cb72c194589faa9e4ad5d9079c9263ae5a027"} Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.676751 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.677151 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-c7t8z" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.677326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.677832 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.678736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fef3f370-7e00-4c47-be4a-23a8919e0c89","Type":"ContainerStarted","Data":"4ad2bed2ae1b1a4a31120fee2c5d39a110cad6f6f11d5bfa58db6084bab6df69"} Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.684671 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-placement-public-svc" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.694714 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6694d87b67-rdpz4" podStartSLOduration=7.69469058 podStartE2EDuration="7.69469058s" podCreationTimestamp="2026-01-29 03:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:36.668331449 +0000 UTC m=+1150.152560354" watchObservedRunningTime="2026-01-29 03:46:36.69469058 +0000 UTC m=+1150.178919485" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-public-tls-certs\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762404 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-fernet-keys\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762442 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-public-tls-certs\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/396ac802-02ea-480d-bd17-d18d64e8958f-logs\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-internal-tls-certs\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-combined-ca-bundle\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762637 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qnnf\" (UniqueName: \"kubernetes.io/projected/0542ad30-5c42-4464-83b9-3faebd15a9ea-kube-api-access-2qnnf\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762680 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-internal-tls-certs\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762735 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-scripts\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-config-data\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-credential-keys\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762816 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h8db\" (UniqueName: \"kubernetes.io/projected/396ac802-02ea-480d-bd17-d18d64e8958f-kube-api-access-2h8db\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762854 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-config-data\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-combined-ca-bundle\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.762900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-scripts\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.784029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-fernet-keys\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.791481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-credential-keys\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.792518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-combined-ca-bundle\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.796979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-config-data\") pod \"keystone-6568fdcd45-j5nxz\" (UID: 
\"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.798633 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-scripts\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.799121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-internal-tls-certs\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.816333 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0542ad30-5c42-4464-83b9-3faebd15a9ea-public-tls-certs\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.835469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qnnf\" (UniqueName: \"kubernetes.io/projected/0542ad30-5c42-4464-83b9-3faebd15a9ea-kube-api-access-2qnnf\") pod \"keystone-6568fdcd45-j5nxz\" (UID: \"0542ad30-5c42-4464-83b9-3faebd15a9ea\") " pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.835568 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-f8cf9f5b7-gdtm6"] Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.837239 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.852404 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-px95z" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.852665 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.852795 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.867984 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f8cf9f5b7-gdtm6"] Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.870471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/396ac802-02ea-480d-bd17-d18d64e8958f-logs\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.870634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-internal-tls-certs\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.870717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-combined-ca-bundle\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.875749 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-scripts\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.875852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-config-data\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.875982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h8db\" (UniqueName: \"kubernetes.io/projected/396ac802-02ea-480d-bd17-d18d64e8958f-kube-api-access-2h8db\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.876188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-public-tls-certs\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.885033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-internal-tls-certs\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.885896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-scripts\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.887522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/396ac802-02ea-480d-bd17-d18d64e8958f-logs\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.889142 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-744498df46-44wwg"] Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.893033 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.896249 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-public-tls-certs\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.898159 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.909429 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-744498df46-44wwg"] Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.912381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-combined-ca-bundle\") pod \"placement-54fd6b997b-lbt29\" (UID: 
\"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.935207 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.943009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-config-data\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.962257 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ggj2r"] Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.962570 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" podUID="f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" containerName="dnsmasq-dns" containerID="cri-o://3fd4b150cc3cce0225f9c529689fb7979f1c4d6d86140fee252a91e3a3dc83d1" gracePeriod=10 Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.976846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h8db\" (UniqueName: \"kubernetes.io/projected/396ac802-02ea-480d-bd17-d18d64e8958f-kube-api-access-2h8db\") pod \"placement-54fd6b997b-lbt29\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") " pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.979239 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.990229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c77a3e-5181-4e8a-b896-16d6b80971a7-logs\") pod 
\"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.990320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.990407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-combined-ca-bundle\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.990441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data-custom\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.990742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh29r\" (UniqueName: \"kubernetes.io/projected/48c77a3e-5181-4e8a-b896-16d6b80971a7-kube-api-access-hh29r\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.990768 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data-custom\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.990859 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-logs\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.990879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-combined-ca-bundle\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.990969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbdr2\" (UniqueName: \"kubernetes.io/projected/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-kube-api-access-lbdr2\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:36 crc kubenswrapper[4707]: I0129 03:46:36.991001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " 
pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.027799 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.078920 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-pj4lj"] Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.080614 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.095833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh29r\" (UniqueName: \"kubernetes.io/projected/48c77a3e-5181-4e8a-b896-16d6b80971a7-kube-api-access-hh29r\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.095884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data-custom\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.095933 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-logs\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.095958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-combined-ca-bundle\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.096011 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbdr2\" (UniqueName: \"kubernetes.io/projected/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-kube-api-access-lbdr2\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.096046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.096108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c77a3e-5181-4e8a-b896-16d6b80971a7-logs\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.096127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.096159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-combined-ca-bundle\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.096181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data-custom\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.110047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-logs\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.119077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c77a3e-5181-4e8a-b896-16d6b80971a7-logs\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.139823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data-custom\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.140210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-combined-ca-bundle\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.140470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data-custom\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.140882 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.141332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-combined-ca-bundle\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.145966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.160470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hh29r\" (UniqueName: \"kubernetes.io/projected/48c77a3e-5181-4e8a-b896-16d6b80971a7-kube-api-access-hh29r\") pod \"barbican-keystone-listener-744498df46-44wwg\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.161594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbdr2\" (UniqueName: \"kubernetes.io/projected/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-kube-api-access-lbdr2\") pod \"barbican-worker-f8cf9f5b7-gdtm6\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.199147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jhw4\" (UniqueName: \"kubernetes.io/projected/07580a17-d3c5-4103-a8a5-cb85569104ec-kube-api-access-6jhw4\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.199492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-svc\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.199671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-config\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.199804 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.199922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.200098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.216638 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-pj4lj"] Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.247349 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-78bfcf785f-txfhz"] Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.250134 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.286636 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7665bc55c6-vnwk8"] Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.288120 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7665bc55c6-vnwk8"] Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.288139 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-78bfcf785f-txfhz"] Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.288229 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.289708 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-774fc4cdc8-zk6d7"] Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.292474 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.301338 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.319441 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jhw4\" (UniqueName: \"kubernetes.io/projected/07580a17-d3c5-4103-a8a5-cb85569104ec-kube-api-access-6jhw4\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.333137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-svc\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.333240 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-config\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.333296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.333327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.333602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.322655 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-774fc4cdc8-zk6d7"] Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.335377 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-svc\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.336563 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.336836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.339315 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.340112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-config\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.347239 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.369746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jhw4\" (UniqueName: \"kubernetes.io/projected/07580a17-d3c5-4103-a8a5-cb85569104ec-kube-api-access-6jhw4\") pod \"dnsmasq-dns-85ff748b95-pj4lj\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.398285 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.473850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd6d292d-51ed-4b96-89e1-06220cd5f98b-config-data\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.473918 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-combined-ca-bundle\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plbj9\" (UniqueName: \"kubernetes.io/projected/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-kube-api-access-plbj9\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/374bee16-aeed-4b53-845a-494375d065f6-logs\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474045 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/cd6d292d-51ed-4b96-89e1-06220cd5f98b-config-data-custom\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474069 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-logs\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6lqk\" (UniqueName: \"kubernetes.io/projected/374bee16-aeed-4b53-845a-494375d065f6-kube-api-access-p6lqk\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/374bee16-aeed-4b53-845a-494375d065f6-config-data\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/374bee16-aeed-4b53-845a-494375d065f6-combined-ca-bundle\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474179 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/374bee16-aeed-4b53-845a-494375d065f6-config-data-custom\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474212 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd6d292d-51ed-4b96-89e1-06220cd5f98b-logs\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd6d292d-51ed-4b96-89e1-06220cd5f98b-combined-ca-bundle\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data-custom\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " 
pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.474279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2598\" (UniqueName: \"kubernetes.io/projected/cd6d292d-51ed-4b96-89e1-06220cd5f98b-kube-api-access-r2598\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.505131 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.579284 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plbj9\" (UniqueName: \"kubernetes.io/projected/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-kube-api-access-plbj9\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.583620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/374bee16-aeed-4b53-845a-494375d065f6-logs\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.583837 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd6d292d-51ed-4b96-89e1-06220cd5f98b-config-data-custom\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.584017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-logs\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.585343 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6lqk\" (UniqueName: \"kubernetes.io/projected/374bee16-aeed-4b53-845a-494375d065f6-kube-api-access-p6lqk\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.585554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/374bee16-aeed-4b53-845a-494375d065f6-config-data\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.586617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/374bee16-aeed-4b53-845a-494375d065f6-combined-ca-bundle\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.588085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/374bee16-aeed-4b53-845a-494375d065f6-config-data-custom\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.588171 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.588261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd6d292d-51ed-4b96-89e1-06220cd5f98b-logs\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.588330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd6d292d-51ed-4b96-89e1-06220cd5f98b-combined-ca-bundle\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.588482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data-custom\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.592691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2598\" (UniqueName: \"kubernetes.io/projected/cd6d292d-51ed-4b96-89e1-06220cd5f98b-kube-api-access-r2598\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.592957 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd6d292d-51ed-4b96-89e1-06220cd5f98b-config-data\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.593100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-combined-ca-bundle\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.591836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd6d292d-51ed-4b96-89e1-06220cd5f98b-logs\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.595611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-logs\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.590970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/374bee16-aeed-4b53-845a-494375d065f6-logs\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.604221 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/374bee16-aeed-4b53-845a-494375d065f6-config-data\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.608045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd6d292d-51ed-4b96-89e1-06220cd5f98b-config-data\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.615036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data-custom\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.615694 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd6d292d-51ed-4b96-89e1-06220cd5f98b-combined-ca-bundle\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.616124 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd6d292d-51ed-4b96-89e1-06220cd5f98b-config-data-custom\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.616365 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/374bee16-aeed-4b53-845a-494375d065f6-config-data-custom\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.616585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/374bee16-aeed-4b53-845a-494375d065f6-combined-ca-bundle\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.621262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plbj9\" (UniqueName: \"kubernetes.io/projected/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-kube-api-access-plbj9\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.621625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-combined-ca-bundle\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.623255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2598\" (UniqueName: \"kubernetes.io/projected/cd6d292d-51ed-4b96-89e1-06220cd5f98b-kube-api-access-r2598\") pod \"barbican-worker-78bfcf785f-txfhz\" (UID: \"cd6d292d-51ed-4b96-89e1-06220cd5f98b\") " pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.623886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p6lqk\" (UniqueName: \"kubernetes.io/projected/374bee16-aeed-4b53-845a-494375d065f6-kube-api-access-p6lqk\") pod \"barbican-keystone-listener-7665bc55c6-vnwk8\" (UID: \"374bee16-aeed-4b53-845a-494375d065f6\") " pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.629699 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data\") pod \"barbican-api-774fc4cdc8-zk6d7\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.736816 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" containerID="3fd4b150cc3cce0225f9c529689fb7979f1c4d6d86140fee252a91e3a3dc83d1" exitCode=0 Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.737490 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" event={"ID":"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e","Type":"ContainerDied","Data":"3fd4b150cc3cce0225f9c529689fb7979f1c4d6d86140fee252a91e3a3dc83d1"} Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.737666 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.737695 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.844270 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-78bfcf785f-txfhz" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.860345 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" Jan 29 03:46:37 crc kubenswrapper[4707]: I0129 03:46:37.882841 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.080774 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.111639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-swift-storage-0\") pod \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.112231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-nb\") pod \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.112300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-config\") pod \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.112601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-svc\") pod \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.112665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-44c98\" (UniqueName: \"kubernetes.io/projected/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-kube-api-access-44c98\") pod \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.112716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-sb\") pod \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\" (UID: \"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e\") " Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.162245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-kube-api-access-44c98" (OuterVolumeSpecName: "kube-api-access-44c98") pod "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" (UID: "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e"). InnerVolumeSpecName "kube-api-access-44c98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.217329 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44c98\" (UniqueName: \"kubernetes.io/projected/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-kube-api-access-44c98\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.307222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" (UID: "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.321215 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" (UID: "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.323799 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.323834 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.327588 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-config" (OuterVolumeSpecName: "config") pod "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" (UID: "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.351624 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6568fdcd45-j5nxz"] Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.353589 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" (UID: "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.368865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" (UID: "f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.376867 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54fd6b997b-lbt29"] Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.408374 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-744498df46-44wwg"] Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.442491 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.442527 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.442627 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:38 crc kubenswrapper[4707]: W0129 03:46:38.443410 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod396ac802_02ea_480d_bd17_d18d64e8958f.slice/crio-ee3314c54a0165b6ed87367e46d3e531aa594c1d5a33502479307545c5317161 WatchSource:0}: Error finding container 
ee3314c54a0165b6ed87367e46d3e531aa594c1d5a33502479307545c5317161: Status 404 returned error can't find the container with id ee3314c54a0165b6ed87367e46d3e531aa594c1d5a33502479307545c5317161 Jan 29 03:46:38 crc kubenswrapper[4707]: W0129 03:46:38.453903 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c77a3e_5181_4e8a_b896_16d6b80971a7.slice/crio-3b20dd77b8e87b342262d49bd1773d39a5fb18866c6c0cd17defcc452e59f188 WatchSource:0}: Error finding container 3b20dd77b8e87b342262d49bd1773d39a5fb18866c6c0cd17defcc452e59f188: Status 404 returned error can't find the container with id 3b20dd77b8e87b342262d49bd1773d39a5fb18866c6c0cd17defcc452e59f188 Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.496504 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 03:46:38 crc kubenswrapper[4707]: I0129 03:46:38.750362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54fd6b997b-lbt29" event={"ID":"396ac802-02ea-480d-bd17-d18d64e8958f","Type":"ContainerStarted","Data":"ee3314c54a0165b6ed87367e46d3e531aa594c1d5a33502479307545c5317161"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:38.752405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-744498df46-44wwg" event={"ID":"48c77a3e-5181-4e8a-b896-16d6b80971a7","Type":"ContainerStarted","Data":"3b20dd77b8e87b342262d49bd1773d39a5fb18866c6c0cd17defcc452e59f188"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:38.755216 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6568fdcd45-j5nxz" event={"ID":"0542ad30-5c42-4464-83b9-3faebd15a9ea","Type":"ContainerStarted","Data":"a498223a91cdbe423690c4e74d58c3235f0908961c7a505b8b3b04ad6b18f1e8"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:38.757174 4707 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:38.757485 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:38.757506 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-ggj2r" event={"ID":"f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e","Type":"ContainerDied","Data":"43c1d2084a044fdf80c1ac1661cde18c885c245e1bd659e48a296f4def2ee420"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:38.757668 4707 scope.go:117] "RemoveContainer" containerID="3fd4b150cc3cce0225f9c529689fb7979f1c4d6d86140fee252a91e3a3dc83d1" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:38.991586 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:38.998919 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-78bfcf785f-txfhz"] Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.013486 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.013636 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.014264 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.067523 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-pj4lj"] Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.095835 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f8cf9f5b7-gdtm6"] Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.113507 4707 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-keystone-listener-7665bc55c6-vnwk8"] Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.146472 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ggj2r"] Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.153073 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-ggj2r"] Jan 29 03:46:39 crc kubenswrapper[4707]: W0129 03:46:39.164168 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod374bee16_aeed_4b53_845a_494375d065f6.slice/crio-0aa21a7a25b07c4a190c6ff75068dd5e3f9c2ab7ef29d8a35e45941c182d5d8a WatchSource:0}: Error finding container 0aa21a7a25b07c4a190c6ff75068dd5e3f9c2ab7ef29d8a35e45941c182d5d8a: Status 404 returned error can't find the container with id 0aa21a7a25b07c4a190c6ff75068dd5e3f9c2ab7ef29d8a35e45941c182d5d8a Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.198378 4707 scope.go:117] "RemoveContainer" containerID="88b999b2ef9b8b4ec97ebb9acbaf51674a8787c3470c3d43d40ef6323612fe81" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.371044 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" path="/var/lib/kubelet/pods/f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e/volumes" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.488455 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-774fc4cdc8-zk6d7"] Jan 29 03:46:39 crc kubenswrapper[4707]: W0129 03:46:39.506736 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe340738_ebb6_4cc0_8942_e6c0f6d59f6a.slice/crio-157bbf20712c650909465337fd0350d744656ffefbf421e4bc5d40aeaaebc80f WatchSource:0}: Error finding container 157bbf20712c650909465337fd0350d744656ffefbf421e4bc5d40aeaaebc80f: Status 404 returned error can't find the 
container with id 157bbf20712c650909465337fd0350d744656ffefbf421e4bc5d40aeaaebc80f Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.781712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78bfcf785f-txfhz" event={"ID":"cd6d292d-51ed-4b96-89e1-06220cd5f98b","Type":"ContainerStarted","Data":"d285e97a0422485f2f51dfd9c013328e28e7cc29d1e73068dfc3f7fe8631dc76"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.785961 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" event={"ID":"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa","Type":"ContainerStarted","Data":"e7911a5a16c4b404166c7d373d38bb503eb38b0b13c614e614d7136b05a53391"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.787719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6568fdcd45-j5nxz" event={"ID":"0542ad30-5c42-4464-83b9-3faebd15a9ea","Type":"ContainerStarted","Data":"93365d14ce26899626d531f71d58f910869ff73820ee933567013e1b83e6d09b"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.787858 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.792810 4707 generic.go:334] "Generic (PLEG): container finished" podID="07580a17-d3c5-4103-a8a5-cb85569104ec" containerID="d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61" exitCode=0 Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.793136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" event={"ID":"07580a17-d3c5-4103-a8a5-cb85569104ec","Type":"ContainerDied","Data":"d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.793175 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" 
event={"ID":"07580a17-d3c5-4103-a8a5-cb85569104ec","Type":"ContainerStarted","Data":"4bbe2cf824179ad7fb088a0d81e46e30cdfff3ee1f893b9fc718b5e8f9e2c915"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.807050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54fd6b997b-lbt29" event={"ID":"396ac802-02ea-480d-bd17-d18d64e8958f","Type":"ContainerStarted","Data":"b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.810609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-774fc4cdc8-zk6d7" event={"ID":"be340738-ebb6-4cc0-8942-e6c0f6d59f6a","Type":"ContainerStarted","Data":"157bbf20712c650909465337fd0350d744656ffefbf421e4bc5d40aeaaebc80f"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.813565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" event={"ID":"374bee16-aeed-4b53-845a-494375d065f6","Type":"ContainerStarted","Data":"0aa21a7a25b07c4a190c6ff75068dd5e3f9c2ab7ef29d8a35e45941c182d5d8a"} Jan 29 03:46:39 crc kubenswrapper[4707]: I0129 03:46:39.849437 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6568fdcd45-j5nxz" podStartSLOduration=3.849412931 podStartE2EDuration="3.849412931s" podCreationTimestamp="2026-01-29 03:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:39.819649853 +0000 UTC m=+1153.303878758" watchObservedRunningTime="2026-01-29 03:46:39.849412931 +0000 UTC m=+1153.333641856" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.662554 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6cf5fb45fd-lqs99"] Jan 29 03:46:40 crc kubenswrapper[4707]: E0129 03:46:40.663430 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" containerName="init" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.663447 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" containerName="init" Jan 29 03:46:40 crc kubenswrapper[4707]: E0129 03:46:40.663476 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" containerName="dnsmasq-dns" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.663483 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" containerName="dnsmasq-dns" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.663717 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d44e5e-7bf6-4478-91bf-bd5c9b161b6e" containerName="dnsmasq-dns" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.664784 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.668473 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.668677 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.732098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbtsk\" (UniqueName: \"kubernetes.io/projected/27a75d55-3866-4c55-bffb-8f1f1d53b687-kube-api-access-dbtsk\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.732186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-config-data\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.732521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-config-data-custom\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.732601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-combined-ca-bundle\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.732649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27a75d55-3866-4c55-bffb-8f1f1d53b687-logs\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.732714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-internal-tls-certs\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.732745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-public-tls-certs\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.733669 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cf5fb45fd-lqs99"] Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.828362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54fd6b997b-lbt29" event={"ID":"396ac802-02ea-480d-bd17-d18d64e8958f","Type":"ContainerStarted","Data":"523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f"} Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.829962 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.830109 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.832131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-774fc4cdc8-zk6d7" event={"ID":"be340738-ebb6-4cc0-8942-e6c0f6d59f6a","Type":"ContainerStarted","Data":"575f6d164bfcdce937417b5d5e8e692dae47e61e20ebca2e68051b469d4975c8"} Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.834308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-combined-ca-bundle\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.834389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/27a75d55-3866-4c55-bffb-8f1f1d53b687-logs\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.834434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-internal-tls-certs\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.834465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-public-tls-certs\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.834523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbtsk\" (UniqueName: \"kubernetes.io/projected/27a75d55-3866-4c55-bffb-8f1f1d53b687-kube-api-access-dbtsk\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.834605 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-config-data\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.834698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-config-data-custom\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.836003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wbtsz" event={"ID":"6808d614-6634-4b2a-9e78-7480a0921415","Type":"ContainerStarted","Data":"8407fb29d0fe3d3fb94280f256eefd1b6681ef7b3ba4a22423a6518dfeffdd18"} Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.836465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27a75d55-3866-4c55-bffb-8f1f1d53b687-logs\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.839913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-config-data-custom\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.842074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-combined-ca-bundle\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.843840 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-config-data\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " 
pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.844224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-public-tls-certs\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.857831 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-54fd6b997b-lbt29" podStartSLOduration=4.857818398 podStartE2EDuration="4.857818398s" podCreationTimestamp="2026-01-29 03:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:40.850898571 +0000 UTC m=+1154.335127476" watchObservedRunningTime="2026-01-29 03:46:40.857818398 +0000 UTC m=+1154.342047303" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.863245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a75d55-3866-4c55-bffb-8f1f1d53b687-internal-tls-certs\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.872727 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-wbtsz" podStartSLOduration=3.891222402 podStartE2EDuration="46.872706272s" podCreationTimestamp="2026-01-29 03:45:54 +0000 UTC" firstStartedPulling="2026-01-29 03:45:56.34119087 +0000 UTC m=+1109.825419775" lastFinishedPulling="2026-01-29 03:46:39.32267474 +0000 UTC m=+1152.806903645" observedRunningTime="2026-01-29 03:46:40.868832392 +0000 UTC m=+1154.353061297" watchObservedRunningTime="2026-01-29 03:46:40.872706272 +0000 UTC 
m=+1154.356935177" Jan 29 03:46:40 crc kubenswrapper[4707]: I0129 03:46:40.875405 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbtsk\" (UniqueName: \"kubernetes.io/projected/27a75d55-3866-4c55-bffb-8f1f1d53b687-kube-api-access-dbtsk\") pod \"barbican-api-6cf5fb45fd-lqs99\" (UID: \"27a75d55-3866-4c55-bffb-8f1f1d53b687\") " pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:41 crc kubenswrapper[4707]: I0129 03:46:41.056412 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:41 crc kubenswrapper[4707]: I0129 03:46:41.857327 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cf5fb45fd-lqs99"] Jan 29 03:46:41 crc kubenswrapper[4707]: I0129 03:46:41.862680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cf5fb45fd-lqs99" event={"ID":"27a75d55-3866-4c55-bffb-8f1f1d53b687","Type":"ContainerStarted","Data":"695df463260fdf73a3314eb5993885b4ae6d23cc7e473cb725c4c936e2b23d9b"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.896217 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" event={"ID":"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa","Type":"ContainerStarted","Data":"2736d36134ffeeef1be6185a6f41beb8e55f7f030bd9491c40b9fdeb3ba8a72a"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.897088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" event={"ID":"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa","Type":"ContainerStarted","Data":"d5a42bbcf90b49ddb0508a4c9e4f39fe661a2fd12a4886410d2c5c1836485352"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.918056 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" podStartSLOduration=4.792220858 podStartE2EDuration="6.918033228s" 
podCreationTimestamp="2026-01-29 03:46:36 +0000 UTC" firstStartedPulling="2026-01-29 03:46:39.156850915 +0000 UTC m=+1152.641079820" lastFinishedPulling="2026-01-29 03:46:41.282663285 +0000 UTC m=+1154.766892190" observedRunningTime="2026-01-29 03:46:42.914846687 +0000 UTC m=+1156.399075592" watchObservedRunningTime="2026-01-29 03:46:42.918033228 +0000 UTC m=+1156.402262133" Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.922944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" event={"ID":"07580a17-d3c5-4103-a8a5-cb85569104ec","Type":"ContainerStarted","Data":"1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.923038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.934190 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-774fc4cdc8-zk6d7" event={"ID":"be340738-ebb6-4cc0-8942-e6c0f6d59f6a","Type":"ContainerStarted","Data":"5da4aca2b346c268062806371ccff5c4ef359dd215e0ac019744dc7bc7333680"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.935214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.935246 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.941167 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" event={"ID":"374bee16-aeed-4b53-845a-494375d065f6","Type":"ContainerStarted","Data":"6e8ebbb95c2541c01bfb430ee44897f751c5a2c7622573cd7374a14e56483ecf"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.941204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" event={"ID":"374bee16-aeed-4b53-845a-494375d065f6","Type":"ContainerStarted","Data":"a6f41584d88af8b6a30c4e7e1396ff4fb1dd52cae0f22aca81907ef1384e20cd"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.945143 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-744498df46-44wwg" event={"ID":"48c77a3e-5181-4e8a-b896-16d6b80971a7","Type":"ContainerStarted","Data":"d4cba128e6c7b1b7549fa9332dcd7e8e7a2fdd488ae00d0c497da92cfd00eff8"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.945176 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-744498df46-44wwg" event={"ID":"48c77a3e-5181-4e8a-b896-16d6b80971a7","Type":"ContainerStarted","Data":"50b3aaa904f5b278cfe02b2ed00c3063c0056a05d58310dba367e2894a99f315"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.955423 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" podStartSLOduration=6.955398833 podStartE2EDuration="6.955398833s" podCreationTimestamp="2026-01-29 03:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:42.943754051 +0000 UTC m=+1156.427982956" watchObservedRunningTime="2026-01-29 03:46:42.955398833 +0000 UTC m=+1156.439627738" Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.956801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cf5fb45fd-lqs99" event={"ID":"27a75d55-3866-4c55-bffb-8f1f1d53b687","Type":"ContainerStarted","Data":"5b4fc361bb3dc0d71c49fc4b7e1dcba222b140b6fefe325c042885af0c913580"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.956862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cf5fb45fd-lqs99" 
event={"ID":"27a75d55-3866-4c55-bffb-8f1f1d53b687","Type":"ContainerStarted","Data":"b7c866f0dd347842149835591452cce34ca740901f25225d4decc73807221ec8"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.956896 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.972808 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gzpxq" event={"ID":"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a","Type":"ContainerStarted","Data":"c16abc9e22c3d3d572f7c87b2770135394cb1ead6a210a0738b60f95009fa4c1"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.977270 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7665bc55c6-vnwk8" podStartSLOduration=3.899519027 podStartE2EDuration="5.977238966s" podCreationTimestamp="2026-01-29 03:46:37 +0000 UTC" firstStartedPulling="2026-01-29 03:46:39.204971077 +0000 UTC m=+1152.689199982" lastFinishedPulling="2026-01-29 03:46:41.282691006 +0000 UTC m=+1154.766919921" observedRunningTime="2026-01-29 03:46:42.967498408 +0000 UTC m=+1156.451727313" watchObservedRunningTime="2026-01-29 03:46:42.977238966 +0000 UTC m=+1156.461467871" Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.999735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78bfcf785f-txfhz" event={"ID":"cd6d292d-51ed-4b96-89e1-06220cd5f98b","Type":"ContainerStarted","Data":"b7fe3426d5ffde1320801f6909de7248f425f3ada777b481e2997211a8828b47"} Jan 29 03:46:42 crc kubenswrapper[4707]: I0129 03:46:42.999806 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78bfcf785f-txfhz" event={"ID":"cd6d292d-51ed-4b96-89e1-06220cd5f98b","Type":"ContainerStarted","Data":"af600a78c60c09f8a259e586c480f177ef560d0edd64d1f7bbc9217955780ef8"} Jan 29 03:46:43 crc kubenswrapper[4707]: I0129 03:46:43.023636 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-744498df46-44wwg"] Jan 29 03:46:43 crc kubenswrapper[4707]: I0129 03:46:43.033781 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-774fc4cdc8-zk6d7" podStartSLOduration=6.033754396 podStartE2EDuration="6.033754396s" podCreationTimestamp="2026-01-29 03:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:43.011054939 +0000 UTC m=+1156.495283844" watchObservedRunningTime="2026-01-29 03:46:43.033754396 +0000 UTC m=+1156.517983291" Jan 29 03:46:43 crc kubenswrapper[4707]: I0129 03:46:43.047960 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6cf5fb45fd-lqs99" podStartSLOduration=3.04793856 podStartE2EDuration="3.04793856s" podCreationTimestamp="2026-01-29 03:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:43.040120328 +0000 UTC m=+1156.524349233" watchObservedRunningTime="2026-01-29 03:46:43.04793856 +0000 UTC m=+1156.532167465" Jan 29 03:46:43 crc kubenswrapper[4707]: I0129 03:46:43.099566 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gzpxq" podStartSLOduration=4.650802398 podStartE2EDuration="49.09951056s" podCreationTimestamp="2026-01-29 03:45:54 +0000 UTC" firstStartedPulling="2026-01-29 03:45:56.932895113 +0000 UTC m=+1110.417124018" lastFinishedPulling="2026-01-29 03:46:41.381603275 +0000 UTC m=+1154.865832180" observedRunningTime="2026-01-29 03:46:43.091868652 +0000 UTC m=+1156.576097557" watchObservedRunningTime="2026-01-29 03:46:43.09951056 +0000 UTC m=+1156.583739465" Jan 29 03:46:43 crc kubenswrapper[4707]: I0129 03:46:43.144607 4707 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/barbican-keystone-listener-744498df46-44wwg" podStartSLOduration=4.264309444 podStartE2EDuration="7.144581314s" podCreationTimestamp="2026-01-29 03:46:36 +0000 UTC" firstStartedPulling="2026-01-29 03:46:38.462626602 +0000 UTC m=+1151.946855507" lastFinishedPulling="2026-01-29 03:46:41.342898472 +0000 UTC m=+1154.827127377" observedRunningTime="2026-01-29 03:46:43.131956935 +0000 UTC m=+1156.616185840" watchObservedRunningTime="2026-01-29 03:46:43.144581314 +0000 UTC m=+1156.628810219" Jan 29 03:46:43 crc kubenswrapper[4707]: I0129 03:46:43.157244 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-78bfcf785f-txfhz" podStartSLOduration=4.028205554 podStartE2EDuration="6.157210864s" podCreationTimestamp="2026-01-29 03:46:37 +0000 UTC" firstStartedPulling="2026-01-29 03:46:39.156862316 +0000 UTC m=+1152.641091221" lastFinishedPulling="2026-01-29 03:46:41.285867606 +0000 UTC m=+1154.770096531" observedRunningTime="2026-01-29 03:46:43.152826149 +0000 UTC m=+1156.637055054" watchObservedRunningTime="2026-01-29 03:46:43.157210864 +0000 UTC m=+1156.641439769" Jan 29 03:46:43 crc kubenswrapper[4707]: I0129 03:46:43.207111 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-f8cf9f5b7-gdtm6"] Jan 29 03:46:44 crc kubenswrapper[4707]: I0129 03:46:44.013762 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:45 crc kubenswrapper[4707]: I0129 03:46:45.018491 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-744498df46-44wwg" podUID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerName="barbican-keystone-listener-log" containerID="cri-o://50b3aaa904f5b278cfe02b2ed00c3063c0056a05d58310dba367e2894a99f315" gracePeriod=30 Jan 29 03:46:45 crc kubenswrapper[4707]: I0129 03:46:45.018597 4707 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/barbican-keystone-listener-744498df46-44wwg" podUID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerName="barbican-keystone-listener" containerID="cri-o://d4cba128e6c7b1b7549fa9332dcd7e8e7a2fdd488ae00d0c497da92cfd00eff8" gracePeriod=30 Jan 29 03:46:45 crc kubenswrapper[4707]: I0129 03:46:45.019239 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" podUID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerName="barbican-worker-log" containerID="cri-o://d5a42bbcf90b49ddb0508a4c9e4f39fe661a2fd12a4886410d2c5c1836485352" gracePeriod=30 Jan 29 03:46:45 crc kubenswrapper[4707]: I0129 03:46:45.019398 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" podUID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerName="barbican-worker" containerID="cri-o://2736d36134ffeeef1be6185a6f41beb8e55f7f030bd9491c40b9fdeb3ba8a72a" gracePeriod=30 Jan 29 03:46:45 crc kubenswrapper[4707]: I0129 03:46:45.635128 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.034955 4707 generic.go:334] "Generic (PLEG): container finished" podID="6808d614-6634-4b2a-9e78-7480a0921415" containerID="8407fb29d0fe3d3fb94280f256eefd1b6681ef7b3ba4a22423a6518dfeffdd18" exitCode=0 Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.035065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wbtsz" event={"ID":"6808d614-6634-4b2a-9e78-7480a0921415","Type":"ContainerDied","Data":"8407fb29d0fe3d3fb94280f256eefd1b6681ef7b3ba4a22423a6518dfeffdd18"} Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.039624 4707 generic.go:334] "Generic (PLEG): container finished" podID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerID="d4cba128e6c7b1b7549fa9332dcd7e8e7a2fdd488ae00d0c497da92cfd00eff8" 
exitCode=0 Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.039646 4707 generic.go:334] "Generic (PLEG): container finished" podID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerID="50b3aaa904f5b278cfe02b2ed00c3063c0056a05d58310dba367e2894a99f315" exitCode=143 Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.039666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-744498df46-44wwg" event={"ID":"48c77a3e-5181-4e8a-b896-16d6b80971a7","Type":"ContainerDied","Data":"d4cba128e6c7b1b7549fa9332dcd7e8e7a2fdd488ae00d0c497da92cfd00eff8"} Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.039716 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-744498df46-44wwg" event={"ID":"48c77a3e-5181-4e8a-b896-16d6b80971a7","Type":"ContainerDied","Data":"50b3aaa904f5b278cfe02b2ed00c3063c0056a05d58310dba367e2894a99f315"} Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.042565 4707 generic.go:334] "Generic (PLEG): container finished" podID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerID="2736d36134ffeeef1be6185a6f41beb8e55f7f030bd9491c40b9fdeb3ba8a72a" exitCode=0 Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.042589 4707 generic.go:334] "Generic (PLEG): container finished" podID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerID="d5a42bbcf90b49ddb0508a4c9e4f39fe661a2fd12a4886410d2c5c1836485352" exitCode=143 Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.042632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" event={"ID":"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa","Type":"ContainerDied","Data":"2736d36134ffeeef1be6185a6f41beb8e55f7f030bd9491c40b9fdeb3ba8a72a"} Jan 29 03:46:46 crc kubenswrapper[4707]: I0129 03:46:46.042686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" 
event={"ID":"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa","Type":"ContainerDied","Data":"d5a42bbcf90b49ddb0508a4c9e4f39fe661a2fd12a4886410d2c5c1836485352"} Jan 29 03:46:47 crc kubenswrapper[4707]: I0129 03:46:47.051958 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" containerID="c16abc9e22c3d3d572f7c87b2770135394cb1ead6a210a0738b60f95009fa4c1" exitCode=0 Jan 29 03:46:47 crc kubenswrapper[4707]: I0129 03:46:47.052452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gzpxq" event={"ID":"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a","Type":"ContainerDied","Data":"c16abc9e22c3d3d572f7c87b2770135394cb1ead6a210a0738b60f95009fa4c1"} Jan 29 03:46:47 crc kubenswrapper[4707]: I0129 03:46:47.515714 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:46:47 crc kubenswrapper[4707]: I0129 03:46:47.578653 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nqp7c"] Jan 29 03:46:47 crc kubenswrapper[4707]: I0129 03:46:47.578937 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c" podUID="c6de908f-0466-47f1-9a1f-bd9a306e98ac" containerName="dnsmasq-dns" containerID="cri-o://1d84ccd48450f7d7325c0a65191a8ce44d451628148d6e9ef16535f4d32b2bb1" gracePeriod=10 Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.068859 4707 generic.go:334] "Generic (PLEG): container finished" podID="c6de908f-0466-47f1-9a1f-bd9a306e98ac" containerID="1d84ccd48450f7d7325c0a65191a8ce44d451628148d6e9ef16535f4d32b2bb1" exitCode=0 Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.069118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c" event={"ID":"c6de908f-0466-47f1-9a1f-bd9a306e98ac","Type":"ContainerDied","Data":"1d84ccd48450f7d7325c0a65191a8ce44d451628148d6e9ef16535f4d32b2bb1"} Jan 29 03:46:48 
crc kubenswrapper[4707]: I0129 03:46:48.450261 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wbtsz" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.451955 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.452195 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.542974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data\") pod \"48c77a3e-5181-4e8a-b896-16d6b80971a7\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543050 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-combined-ca-bundle\") pod \"48c77a3e-5181-4e8a-b896-16d6b80971a7\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8qrk\" (UniqueName: \"kubernetes.io/projected/6808d614-6634-4b2a-9e78-7480a0921415-kube-api-access-b8qrk\") pod \"6808d614-6634-4b2a-9e78-7480a0921415\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543281 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-combined-ca-bundle\") pod \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " Jan 29 
03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data-custom\") pod \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543378 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data-custom\") pod \"48c77a3e-5181-4e8a-b896-16d6b80971a7\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-config-data\") pod \"6808d614-6634-4b2a-9e78-7480a0921415\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543445 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbdr2\" (UniqueName: \"kubernetes.io/projected/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-kube-api-access-lbdr2\") pod \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-logs\") pod \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh29r\" (UniqueName: 
\"kubernetes.io/projected/48c77a3e-5181-4e8a-b896-16d6b80971a7-kube-api-access-hh29r\") pod \"48c77a3e-5181-4e8a-b896-16d6b80971a7\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-combined-ca-bundle\") pod \"6808d614-6634-4b2a-9e78-7480a0921415\" (UID: \"6808d614-6634-4b2a-9e78-7480a0921415\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data\") pod \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\" (UID: \"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.543723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c77a3e-5181-4e8a-b896-16d6b80971a7-logs\") pod \"48c77a3e-5181-4e8a-b896-16d6b80971a7\" (UID: \"48c77a3e-5181-4e8a-b896-16d6b80971a7\") " Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.544930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c77a3e-5181-4e8a-b896-16d6b80971a7-logs" (OuterVolumeSpecName: "logs") pod "48c77a3e-5181-4e8a-b896-16d6b80971a7" (UID: "48c77a3e-5181-4e8a-b896-16d6b80971a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.549183 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-logs" (OuterVolumeSpecName: "logs") pod "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" (UID: "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.549839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-kube-api-access-lbdr2" (OuterVolumeSpecName: "kube-api-access-lbdr2") pod "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" (UID: "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa"). InnerVolumeSpecName "kube-api-access-lbdr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.559042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c77a3e-5181-4e8a-b896-16d6b80971a7-kube-api-access-hh29r" (OuterVolumeSpecName: "kube-api-access-hh29r") pod "48c77a3e-5181-4e8a-b896-16d6b80971a7" (UID: "48c77a3e-5181-4e8a-b896-16d6b80971a7"). InnerVolumeSpecName "kube-api-access-hh29r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.563659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "48c77a3e-5181-4e8a-b896-16d6b80971a7" (UID: "48c77a3e-5181-4e8a-b896-16d6b80971a7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.564329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" (UID: "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.583595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6808d614-6634-4b2a-9e78-7480a0921415-kube-api-access-b8qrk" (OuterVolumeSpecName: "kube-api-access-b8qrk") pod "6808d614-6634-4b2a-9e78-7480a0921415" (UID: "6808d614-6634-4b2a-9e78-7480a0921415"). InnerVolumeSpecName "kube-api-access-b8qrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.598197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" (UID: "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.655521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6808d614-6634-4b2a-9e78-7480a0921415" (UID: "6808d614-6634-4b2a-9e78-7480a0921415"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.660213 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh29r\" (UniqueName: \"kubernetes.io/projected/48c77a3e-5181-4e8a-b896-16d6b80971a7-kube-api-access-hh29r\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.660246 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.660256 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48c77a3e-5181-4e8a-b896-16d6b80971a7-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.660268 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8qrk\" (UniqueName: \"kubernetes.io/projected/6808d614-6634-4b2a-9e78-7480a0921415-kube-api-access-b8qrk\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.660278 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.660286 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.660294 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: 
I0129 03:46:48.660309 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbdr2\" (UniqueName: \"kubernetes.io/projected/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-kube-api-access-lbdr2\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.660317 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.680893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data" (OuterVolumeSpecName: "config-data") pod "48c77a3e-5181-4e8a-b896-16d6b80971a7" (UID: "48c77a3e-5181-4e8a-b896-16d6b80971a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.695620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48c77a3e-5181-4e8a-b896-16d6b80971a7" (UID: "48c77a3e-5181-4e8a-b896-16d6b80971a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.722879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data" (OuterVolumeSpecName: "config-data") pod "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" (UID: "283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.753692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-config-data" (OuterVolumeSpecName: "config-data") pod "6808d614-6634-4b2a-9e78-7480a0921415" (UID: "6808d614-6634-4b2a-9e78-7480a0921415"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.762349 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6808d614-6634-4b2a-9e78-7480a0921415-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.762461 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.762585 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:48 crc kubenswrapper[4707]: I0129 03:46:48.762599 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c77a3e-5181-4e8a-b896-16d6b80971a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.079937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" event={"ID":"283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa","Type":"ContainerDied","Data":"e7911a5a16c4b404166c7d373d38bb503eb38b0b13c614e614d7136b05a53391"} Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.080003 4707 scope.go:117] "RemoveContainer" 
containerID="2736d36134ffeeef1be6185a6f41beb8e55f7f030bd9491c40b9fdeb3ba8a72a" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.079948 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f8cf9f5b7-gdtm6" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.082063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wbtsz" event={"ID":"6808d614-6634-4b2a-9e78-7480a0921415","Type":"ContainerDied","Data":"f3fe3eb0e4c11ab308c643d804a5fad60892d3e16f17e05de146b7dd3a28053f"} Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.082087 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3fe3eb0e4c11ab308c643d804a5fad60892d3e16f17e05de146b7dd3a28053f" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.082152 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wbtsz" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.085804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-744498df46-44wwg" event={"ID":"48c77a3e-5181-4e8a-b896-16d6b80971a7","Type":"ContainerDied","Data":"3b20dd77b8e87b342262d49bd1773d39a5fb18866c6c0cd17defcc452e59f188"} Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.085870 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-744498df46-44wwg" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.132360 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-f8cf9f5b7-gdtm6"] Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.147437 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-f8cf9f5b7-gdtm6"] Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.155937 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-744498df46-44wwg"] Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.167549 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-744498df46-44wwg"] Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.256162 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" path="/var/lib/kubelet/pods/283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa/volumes" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.257031 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c77a3e-5181-4e8a-b896-16d6b80971a7" path="/var/lib/kubelet/pods/48c77a3e-5181-4e8a-b896-16d6b80971a7/volumes" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.641618 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.658909 4707 scope.go:117] "RemoveContainer" containerID="d5a42bbcf90b49ddb0508a4c9e4f39fe661a2fd12a4886410d2c5c1836485352" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.888379 4707 scope.go:117] "RemoveContainer" containerID="d4cba128e6c7b1b7549fa9332dcd7e8e7a2fdd488ae00d0c497da92cfd00eff8" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.905020 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.909023 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gzpxq" Jan 29 03:46:49 crc kubenswrapper[4707]: I0129 03:46:49.955843 4707 scope.go:117] "RemoveContainer" containerID="50b3aaa904f5b278cfe02b2ed00c3063c0056a05d58310dba367e2894a99f315" Jan 29 03:46:49 crc kubenswrapper[4707]: E0129 03:46:49.993449 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.001303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-svc\") pod \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.001438 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-combined-ca-bundle\") pod \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002072 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-scripts\") pod \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-sb\") pod \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-config-data\") pod \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002254 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-db-sync-config-data\") pod \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") " Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wcws\" (UniqueName: \"kubernetes.io/projected/c6de908f-0466-47f1-9a1f-bd9a306e98ac-kube-api-access-4wcws\") pod \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002335 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-config\") pod \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002418 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-nb\") pod \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") " Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 
03:46:50.002458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-swift-storage-0\") pod \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\" (UID: \"c6de908f-0466-47f1-9a1f-bd9a306e98ac\") "
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx4fg\" (UniqueName: \"kubernetes.io/projected/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-kube-api-access-kx4fg\") pod \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") "
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002562 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-etc-machine-id\") pod \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\" (UID: \"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a\") "
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.002988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" (UID: "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.008245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-scripts" (OuterVolumeSpecName: "scripts") pod "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" (UID: "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.009061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6de908f-0466-47f1-9a1f-bd9a306e98ac-kube-api-access-4wcws" (OuterVolumeSpecName: "kube-api-access-4wcws") pod "c6de908f-0466-47f1-9a1f-bd9a306e98ac" (UID: "c6de908f-0466-47f1-9a1f-bd9a306e98ac"). InnerVolumeSpecName "kube-api-access-4wcws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.009173 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-kube-api-access-kx4fg" (OuterVolumeSpecName: "kube-api-access-kx4fg") pod "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" (UID: "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a"). InnerVolumeSpecName "kube-api-access-kx4fg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.011160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" (UID: "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.034865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" (UID: "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.052067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c6de908f-0466-47f1-9a1f-bd9a306e98ac" (UID: "c6de908f-0466-47f1-9a1f-bd9a306e98ac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.055870 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c6de908f-0466-47f1-9a1f-bd9a306e98ac" (UID: "c6de908f-0466-47f1-9a1f-bd9a306e98ac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.056223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-config-data" (OuterVolumeSpecName: "config-data") pod "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" (UID: "e6b80fc8-a8ca-417d-9f86-d4fb86587f3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.058520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c6de908f-0466-47f1-9a1f-bd9a306e98ac" (UID: "c6de908f-0466-47f1-9a1f-bd9a306e98ac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.064074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-config" (OuterVolumeSpecName: "config") pod "c6de908f-0466-47f1-9a1f-bd9a306e98ac" (UID: "c6de908f-0466-47f1-9a1f-bd9a306e98ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.073037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6de908f-0466-47f1-9a1f-bd9a306e98ac" (UID: "c6de908f-0466-47f1-9a1f-bd9a306e98ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.103623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c" event={"ID":"c6de908f-0466-47f1-9a1f-bd9a306e98ac","Type":"ContainerDied","Data":"b097362ce1d78d96c4ab613b9046086dc51530f99fd092f631953e07a785ef91"}
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.103718 4707 scope.go:117] "RemoveContainer" containerID="1d84ccd48450f7d7325c0a65191a8ce44d451628148d6e9ef16535f4d32b2bb1"
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.103779 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nqp7c"
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105070 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105105 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105120 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105130 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105144 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105155 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105167 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wcws\" (UniqueName: \"kubernetes.io/projected/c6de908f-0466-47f1-9a1f-bd9a306e98ac-kube-api-access-4wcws\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105180 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105190 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105200 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6de908f-0466-47f1-9a1f-bd9a306e98ac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105211 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx4fg\" (UniqueName: \"kubernetes.io/projected/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-kube-api-access-kx4fg\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.105220 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.108349 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fef3f370-7e00-4c47-be4a-23a8919e0c89","Type":"ContainerStarted","Data":"ccbb722beb417e709d2860e571bb1f81fa88f2177d8fcca68da73dd5f8be6d62"}
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.108596 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="ceilometer-notification-agent" containerID="cri-o://6b15895a18792bdda14c470f84a8e95e01aa65e1e0c4f766264206f64e735a5b" gracePeriod=30
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.108903 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.108962 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="sg-core" containerID="cri-o://4ad2bed2ae1b1a4a31120fee2c5d39a110cad6f6f11d5bfa58db6084bab6df69" gracePeriod=30
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.109038 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="proxy-httpd" containerID="cri-o://ccbb722beb417e709d2860e571bb1f81fa88f2177d8fcca68da73dd5f8be6d62" gracePeriod=30
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.122024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gzpxq" event={"ID":"e6b80fc8-a8ca-417d-9f86-d4fb86587f3a","Type":"ContainerDied","Data":"f86bd62599e9bffdf6a60b1d3fbe5c8b28df051eab3a359c37f71e7e99a18546"}
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.122072 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f86bd62599e9bffdf6a60b1d3fbe5c8b28df051eab3a359c37f71e7e99a18546"
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.122160 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gzpxq"
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.158044 4707 scope.go:117] "RemoveContainer" containerID="4cba5a88b930ff177d7f3114c2e2b8e6ab0e091402f14d9b24d35fd3f296a837"
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.165251 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nqp7c"]
Jan 29 03:46:50 crc kubenswrapper[4707]: I0129 03:46:50.179944 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nqp7c"]
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.142511 4707 generic.go:334] "Generic (PLEG): container finished" podID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerID="ccbb722beb417e709d2860e571bb1f81fa88f2177d8fcca68da73dd5f8be6d62" exitCode=0
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.143038 4707 generic.go:334] "Generic (PLEG): container finished" podID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerID="4ad2bed2ae1b1a4a31120fee2c5d39a110cad6f6f11d5bfa58db6084bab6df69" exitCode=2
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.142589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fef3f370-7e00-4c47-be4a-23a8919e0c89","Type":"ContainerDied","Data":"ccbb722beb417e709d2860e571bb1f81fa88f2177d8fcca68da73dd5f8be6d62"}
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.143101 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fef3f370-7e00-4c47-be4a-23a8919e0c89","Type":"ContainerDied","Data":"4ad2bed2ae1b1a4a31120fee2c5d39a110cad6f6f11d5bfa58db6084bab6df69"}
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.257182 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6de908f-0466-47f1-9a1f-bd9a306e98ac" path="/var/lib/kubelet/pods/c6de908f-0466-47f1-9a1f-bd9a306e98ac/volumes"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.307879 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 03:46:51 crc kubenswrapper[4707]: E0129 03:46:51.308370 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6de908f-0466-47f1-9a1f-bd9a306e98ac" containerName="dnsmasq-dns"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308388 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6de908f-0466-47f1-9a1f-bd9a306e98ac" containerName="dnsmasq-dns"
Jan 29 03:46:51 crc kubenswrapper[4707]: E0129 03:46:51.308399 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerName="barbican-keystone-listener"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308406 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerName="barbican-keystone-listener"
Jan 29 03:46:51 crc kubenswrapper[4707]: E0129 03:46:51.308418 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerName="barbican-worker-log"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308427 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerName="barbican-worker-log"
Jan 29 03:46:51 crc kubenswrapper[4707]: E0129 03:46:51.308440 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6808d614-6634-4b2a-9e78-7480a0921415" containerName="heat-db-sync"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308447 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6808d614-6634-4b2a-9e78-7480a0921415" containerName="heat-db-sync"
Jan 29 03:46:51 crc kubenswrapper[4707]: E0129 03:46:51.308472 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6de908f-0466-47f1-9a1f-bd9a306e98ac" containerName="init"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308478 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6de908f-0466-47f1-9a1f-bd9a306e98ac" containerName="init"
Jan 29 03:46:51 crc kubenswrapper[4707]: E0129 03:46:51.308489 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" containerName="cinder-db-sync"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308494 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" containerName="cinder-db-sync"
Jan 29 03:46:51 crc kubenswrapper[4707]: E0129 03:46:51.308502 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerName="barbican-keystone-listener-log"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308508 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerName="barbican-keystone-listener-log"
Jan 29 03:46:51 crc kubenswrapper[4707]: E0129 03:46:51.308525 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerName="barbican-worker"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308534 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerName="barbican-worker"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308726 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" containerName="cinder-db-sync"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308750 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6808d614-6634-4b2a-9e78-7480a0921415" containerName="heat-db-sync"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308759 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerName="barbican-keystone-listener"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308767 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerName="barbican-worker"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308778 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6de908f-0466-47f1-9a1f-bd9a306e98ac" containerName="dnsmasq-dns"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308803 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c77a3e-5181-4e8a-b896-16d6b80971a7" containerName="barbican-keystone-listener-log"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.308813 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="283f3ddb-6cc2-4f66-bbaa-71eb0d66deaa" containerName="barbican-worker-log"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.310080 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.312848 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.313194 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.313381 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tgh4r"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.313580 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.319932 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbzsq"]
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.339290 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.353296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbzsq"]
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.372274 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.434411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.434489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.434542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.434601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.434640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/23408df0-eb02-4602-ada1-f85c0cffb4a5-kube-api-access-rxs45\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.434913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.435087 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-config\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.435167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.435283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.435417 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23408df0-eb02-4602-ada1-f85c0cffb4a5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.435531 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mht99\" (UniqueName: \"kubernetes.io/projected/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-kube-api-access-mht99\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.435612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-scripts\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.506639 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.508624 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.515251 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.535326 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.536870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.536911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.536941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.536965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.536989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/23408df0-eb02-4602-ada1-f85c0cffb4a5-kube-api-access-rxs45\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.537015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.537036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-config\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.537057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.537100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.537142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23408df0-eb02-4602-ada1-f85c0cffb4a5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.537180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mht99\" (UniqueName: \"kubernetes.io/projected/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-kube-api-access-mht99\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.537207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-scripts\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.541963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.543237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.543298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.543396 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23408df0-eb02-4602-ada1-f85c0cffb4a5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.543860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-config\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.544463 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.552448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.556247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.573522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.582706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/23408df0-eb02-4602-ada1-f85c0cffb4a5-kube-api-access-rxs45\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.583437 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mht99\" (UniqueName: \"kubernetes.io/projected/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-kube-api-access-mht99\") pod \"dnsmasq-dns-5c9776ccc5-hbzsq\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.586200 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-scripts\") pod \"cinder-scheduler-0\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.645907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.645989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data-custom\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.646100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e3f825-8130-471a-b673-d42a4077accb-logs\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.646180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlhs5\" (UniqueName: \"kubernetes.io/projected/a1e3f825-8130-471a-b673-d42a4077accb-kube-api-access-zlhs5\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.646198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1e3f825-8130-471a-b673-d42a4077accb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.646214 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-scripts\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.646235 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.646511 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.686138 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.750005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlhs5\" (UniqueName: \"kubernetes.io/projected/a1e3f825-8130-471a-b673-d42a4077accb-kube-api-access-zlhs5\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.750060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1e3f825-8130-471a-b673-d42a4077accb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.750085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-scripts\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.750108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.750127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.750162 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data-custom\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.750243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e3f825-8130-471a-b673-d42a4077accb-logs\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.750900 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e3f825-8130-471a-b673-d42a4077accb-logs\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.751248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1e3f825-8130-471a-b673-d42a4077accb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.758731 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.759459 4707
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0" Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.763847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data-custom\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0" Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.769653 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-scripts\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0" Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.784798 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlhs5\" (UniqueName: \"kubernetes.io/projected/a1e3f825-8130-471a-b673-d42a4077accb-kube-api-access-zlhs5\") pod \"cinder-api-0\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " pod="openstack/cinder-api-0" Jan 29 03:46:51 crc kubenswrapper[4707]: I0129 03:46:51.837884 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 29 03:46:52 crc kubenswrapper[4707]: I0129 03:46:52.344223 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 03:46:52 crc kubenswrapper[4707]: W0129 03:46:52.476169 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1e3f825_8130_471a_b673_d42a4077accb.slice/crio-4966a50703cdd73e5effd81984e418ff33a3d7020f4eb1855d3df4f0b00bed16 WatchSource:0}: Error finding container 4966a50703cdd73e5effd81984e418ff33a3d7020f4eb1855d3df4f0b00bed16: Status 404 returned error can't find the container with id 4966a50703cdd73e5effd81984e418ff33a3d7020f4eb1855d3df4f0b00bed16 Jan 29 03:46:52 crc kubenswrapper[4707]: W0129 03:46:52.479536 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb8317cf_d24b_4fc7_a1d6_dd4f2881bb92.slice/crio-b4c19a3847c6ef6ddac28b3c557a0413d871442d889b16296c5a765d7aff1e65 WatchSource:0}: Error finding container b4c19a3847c6ef6ddac28b3c557a0413d871442d889b16296c5a765d7aff1e65: Status 404 returned error can't find the container with id b4c19a3847c6ef6ddac28b3c557a0413d871442d889b16296c5a765d7aff1e65 Jan 29 03:46:52 crc kubenswrapper[4707]: I0129 03:46:52.481006 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 03:46:52 crc kubenswrapper[4707]: I0129 03:46:52.489646 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbzsq"] Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.180300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a1e3f825-8130-471a-b673-d42a4077accb","Type":"ContainerStarted","Data":"906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec"} Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.180813 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a1e3f825-8130-471a-b673-d42a4077accb","Type":"ContainerStarted","Data":"4966a50703cdd73e5effd81984e418ff33a3d7020f4eb1855d3df4f0b00bed16"} Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.183912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23408df0-eb02-4602-ada1-f85c0cffb4a5","Type":"ContainerStarted","Data":"7ae34194173c025db2dbe80ec302d794091a2af94cb1a696036f52eaf989d47b"} Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.186612 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" containerID="c73244cc348c64a8b54de140e7180144e315f14a6a60e1ccb0321e280bb70416" exitCode=0 Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.186673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" event={"ID":"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92","Type":"ContainerDied","Data":"c73244cc348c64a8b54de140e7180144e315f14a6a60e1ccb0321e280bb70416"} Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.186712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" event={"ID":"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92","Type":"ContainerStarted","Data":"b4c19a3847c6ef6ddac28b3c557a0413d871442d889b16296c5a765d7aff1e65"} Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.307767 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.350925 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cf5fb45fd-lqs99" Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.446997 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-774fc4cdc8-zk6d7"] Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.447317 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-774fc4cdc8-zk6d7" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api-log" containerID="cri-o://575f6d164bfcdce937417b5d5e8e692dae47e61e20ebca2e68051b469d4975c8" gracePeriod=30 Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.448271 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-774fc4cdc8-zk6d7" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api" containerID="cri-o://5da4aca2b346c268062806371ccff5c4ef359dd215e0ac019744dc7bc7333680" gracePeriod=30 Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.479954 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-774fc4cdc8-zk6d7" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": EOF" Jan 29 03:46:53 crc kubenswrapper[4707]: I0129 03:46:53.564198 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 29 03:46:54 crc kubenswrapper[4707]: I0129 03:46:54.204267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" event={"ID":"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92","Type":"ContainerStarted","Data":"503feb7ea47f72e524427b1131df8bfd7e2d320d9513a05384b1d9c1a6962c52"} Jan 29 03:46:54 crc kubenswrapper[4707]: I0129 03:46:54.204891 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" Jan 29 03:46:54 crc kubenswrapper[4707]: I0129 03:46:54.207652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23408df0-eb02-4602-ada1-f85c0cffb4a5","Type":"ContainerStarted","Data":"982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964"} Jan 29 03:46:54 crc kubenswrapper[4707]: I0129 03:46:54.224175 4707 
generic.go:334] "Generic (PLEG): container finished" podID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerID="575f6d164bfcdce937417b5d5e8e692dae47e61e20ebca2e68051b469d4975c8" exitCode=143 Jan 29 03:46:54 crc kubenswrapper[4707]: I0129 03:46:54.225258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-774fc4cdc8-zk6d7" event={"ID":"be340738-ebb6-4cc0-8942-e6c0f6d59f6a","Type":"ContainerDied","Data":"575f6d164bfcdce937417b5d5e8e692dae47e61e20ebca2e68051b469d4975c8"} Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.236731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a1e3f825-8130-471a-b673-d42a4077accb","Type":"ContainerStarted","Data":"5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c"} Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.237620 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.237193 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a1e3f825-8130-471a-b673-d42a4077accb" containerName="cinder-api-log" containerID="cri-o://906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec" gracePeriod=30 Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.237662 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a1e3f825-8130-471a-b673-d42a4077accb" containerName="cinder-api" containerID="cri-o://5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c" gracePeriod=30 Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.267830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23408df0-eb02-4602-ada1-f85c0cffb4a5","Type":"ContainerStarted","Data":"ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1"} Jan 29 03:46:55 crc kubenswrapper[4707]: 
I0129 03:46:55.270473 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" podStartSLOduration=4.27044168 podStartE2EDuration="4.27044168s" podCreationTimestamp="2026-01-29 03:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:54.255529497 +0000 UTC m=+1167.739758402" watchObservedRunningTime="2026-01-29 03:46:55.27044168 +0000 UTC m=+1168.754670585" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.270839 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.270835021 podStartE2EDuration="4.270835021s" podCreationTimestamp="2026-01-29 03:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:55.258781418 +0000 UTC m=+1168.743010333" watchObservedRunningTime="2026-01-29 03:46:55.270835021 +0000 UTC m=+1168.755063926" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.274995 4707 generic.go:334] "Generic (PLEG): container finished" podID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerID="6b15895a18792bdda14c470f84a8e95e01aa65e1e0c4f766264206f64e735a5b" exitCode=0 Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.275108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fef3f370-7e00-4c47-be4a-23a8919e0c89","Type":"ContainerDied","Data":"6b15895a18792bdda14c470f84a8e95e01aa65e1e0c4f766264206f64e735a5b"} Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.308970 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.591321407 podStartE2EDuration="4.308932627s" podCreationTimestamp="2026-01-29 03:46:51 +0000 UTC" firstStartedPulling="2026-01-29 03:46:52.353988419 +0000 UTC 
m=+1165.838217324" lastFinishedPulling="2026-01-29 03:46:53.071599649 +0000 UTC m=+1166.555828544" observedRunningTime="2026-01-29 03:46:55.284109979 +0000 UTC m=+1168.768338884" watchObservedRunningTime="2026-01-29 03:46:55.308932627 +0000 UTC m=+1168.793161532" Jan 29 03:46:55 crc kubenswrapper[4707]: E0129 03:46:55.615943 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1e3f825_8130_471a_b673_d42a4077accb.slice/crio-conmon-5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1e3f825_8130_471a_b673_d42a4077accb.slice/crio-5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c.scope\": RecentStats: unable to find data in memory cache]" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.733602 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.881030 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-combined-ca-bundle\") pod \"fef3f370-7e00-4c47-be4a-23a8919e0c89\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.881474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-run-httpd\") pod \"fef3f370-7e00-4c47-be4a-23a8919e0c89\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.881671 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-config-data\") pod \"fef3f370-7e00-4c47-be4a-23a8919e0c89\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.881694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-sg-core-conf-yaml\") pod \"fef3f370-7e00-4c47-be4a-23a8919e0c89\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.881716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-log-httpd\") pod \"fef3f370-7e00-4c47-be4a-23a8919e0c89\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.881850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-scripts\") pod \"fef3f370-7e00-4c47-be4a-23a8919e0c89\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.881877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8p55\" (UniqueName: \"kubernetes.io/projected/fef3f370-7e00-4c47-be4a-23a8919e0c89-kube-api-access-c8p55\") pod \"fef3f370-7e00-4c47-be4a-23a8919e0c89\" (UID: \"fef3f370-7e00-4c47-be4a-23a8919e0c89\") " Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.883014 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fef3f370-7e00-4c47-be4a-23a8919e0c89" (UID: "fef3f370-7e00-4c47-be4a-23a8919e0c89"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.883186 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fef3f370-7e00-4c47-be4a-23a8919e0c89" (UID: "fef3f370-7e00-4c47-be4a-23a8919e0c89"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.894891 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef3f370-7e00-4c47-be4a-23a8919e0c89-kube-api-access-c8p55" (OuterVolumeSpecName: "kube-api-access-c8p55") pod "fef3f370-7e00-4c47-be4a-23a8919e0c89" (UID: "fef3f370-7e00-4c47-be4a-23a8919e0c89"). InnerVolumeSpecName "kube-api-access-c8p55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.899692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-scripts" (OuterVolumeSpecName: "scripts") pod "fef3f370-7e00-4c47-be4a-23a8919e0c89" (UID: "fef3f370-7e00-4c47-be4a-23a8919e0c89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.942470 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fef3f370-7e00-4c47-be4a-23a8919e0c89" (UID: "fef3f370-7e00-4c47-be4a-23a8919e0c89"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.989337 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.989379 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.989388 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:55 crc kubenswrapper[4707]: I0129 03:46:55.989398 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8p55\" (UniqueName: \"kubernetes.io/projected/fef3f370-7e00-4c47-be4a-23a8919e0c89-kube-api-access-c8p55\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:55 crc 
kubenswrapper[4707]: I0129 03:46:55.989408 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fef3f370-7e00-4c47-be4a-23a8919e0c89-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.005761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fef3f370-7e00-4c47-be4a-23a8919e0c89" (UID: "fef3f370-7e00-4c47-be4a-23a8919e0c89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.024526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-config-data" (OuterVolumeSpecName: "config-data") pod "fef3f370-7e00-4c47-be4a-23a8919e0c89" (UID: "fef3f370-7e00-4c47-be4a-23a8919e0c89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.053994 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.091469 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.091513 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fef3f370-7e00-4c47-be4a-23a8919e0c89-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.192462 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-combined-ca-bundle\") pod \"a1e3f825-8130-471a-b673-d42a4077accb\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.192945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data\") pod \"a1e3f825-8130-471a-b673-d42a4077accb\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.192972 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e3f825-8130-471a-b673-d42a4077accb-logs\") pod \"a1e3f825-8130-471a-b673-d42a4077accb\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.193109 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-scripts\") pod \"a1e3f825-8130-471a-b673-d42a4077accb\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " Jan 29 03:46:56 crc 
kubenswrapper[4707]: I0129 03:46:56.193131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data-custom\") pod \"a1e3f825-8130-471a-b673-d42a4077accb\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.193156 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1e3f825-8130-471a-b673-d42a4077accb-etc-machine-id\") pod \"a1e3f825-8130-471a-b673-d42a4077accb\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.193198 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlhs5\" (UniqueName: \"kubernetes.io/projected/a1e3f825-8130-471a-b673-d42a4077accb-kube-api-access-zlhs5\") pod \"a1e3f825-8130-471a-b673-d42a4077accb\" (UID: \"a1e3f825-8130-471a-b673-d42a4077accb\") " Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.194690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e3f825-8130-471a-b673-d42a4077accb-logs" (OuterVolumeSpecName: "logs") pod "a1e3f825-8130-471a-b673-d42a4077accb" (UID: "a1e3f825-8130-471a-b673-d42a4077accb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.194723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1e3f825-8130-471a-b673-d42a4077accb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a1e3f825-8130-471a-b673-d42a4077accb" (UID: "a1e3f825-8130-471a-b673-d42a4077accb"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.202010 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a1e3f825-8130-471a-b673-d42a4077accb" (UID: "a1e3f825-8130-471a-b673-d42a4077accb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.215874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e3f825-8130-471a-b673-d42a4077accb-kube-api-access-zlhs5" (OuterVolumeSpecName: "kube-api-access-zlhs5") pod "a1e3f825-8130-471a-b673-d42a4077accb" (UID: "a1e3f825-8130-471a-b673-d42a4077accb"). InnerVolumeSpecName "kube-api-access-zlhs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.226779 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-scripts" (OuterVolumeSpecName: "scripts") pod "a1e3f825-8130-471a-b673-d42a4077accb" (UID: "a1e3f825-8130-471a-b673-d42a4077accb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.242761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1e3f825-8130-471a-b673-d42a4077accb" (UID: "a1e3f825-8130-471a-b673-d42a4077accb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.282083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data" (OuterVolumeSpecName: "config-data") pod "a1e3f825-8130-471a-b673-d42a4077accb" (UID: "a1e3f825-8130-471a-b673-d42a4077accb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297090 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297121 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e3f825-8130-471a-b673-d42a4077accb-logs\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297130 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297140 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297152 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1e3f825-8130-471a-b673-d42a4077accb-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297161 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlhs5\" (UniqueName: \"kubernetes.io/projected/a1e3f825-8130-471a-b673-d42a4077accb-kube-api-access-zlhs5\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297172 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e3f825-8130-471a-b673-d42a4077accb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297654 4707 generic.go:334] "Generic (PLEG): container finished" podID="a1e3f825-8130-471a-b673-d42a4077accb" containerID="5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c" exitCode=0
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297712 4707 generic.go:334] "Generic (PLEG): container finished" podID="a1e3f825-8130-471a-b673-d42a4077accb" containerID="906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec" exitCode=143
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.297838 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.300648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a1e3f825-8130-471a-b673-d42a4077accb","Type":"ContainerDied","Data":"5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c"}
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.300731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a1e3f825-8130-471a-b673-d42a4077accb","Type":"ContainerDied","Data":"906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec"}
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.300744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a1e3f825-8130-471a-b673-d42a4077accb","Type":"ContainerDied","Data":"4966a50703cdd73e5effd81984e418ff33a3d7020f4eb1855d3df4f0b00bed16"}
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.300763 4707 scope.go:117] "RemoveContainer" containerID="5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.313177 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.323816 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fef3f370-7e00-4c47-be4a-23a8919e0c89","Type":"ContainerDied","Data":"93111b29842fc06a3a387200a23c87704f034a5a4cc6ab30d960b0da11915186"}
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.384047 4707 scope.go:117] "RemoveContainer" containerID="906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.407273 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.420438 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.445629 4707 scope.go:117] "RemoveContainer" containerID="5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c"
Jan 29 03:46:56 crc kubenswrapper[4707]: E0129 03:46:56.456935 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c\": container with ID starting with 5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c not found: ID does not exist" containerID="5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.457067 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c"} err="failed to get container status \"5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c\": rpc error: code = NotFound desc = could not find container \"5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c\": container with ID starting with 5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c not found: ID does not exist"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.457110 4707 scope.go:117] "RemoveContainer" containerID="906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec"
Jan 29 03:46:56 crc kubenswrapper[4707]: E0129 03:46:56.457970 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec\": container with ID starting with 906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec not found: ID does not exist" containerID="906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.457995 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec"} err="failed to get container status \"906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec\": rpc error: code = NotFound desc = could not find container \"906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec\": container with ID starting with 906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec not found: ID does not exist"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.458009 4707 scope.go:117] "RemoveContainer" containerID="5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.464092 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c"} err="failed to get container status \"5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c\": rpc error: code = NotFound desc = could not find container \"5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c\": container with ID starting with 5348649bf08dcd3b346a5594d3bb77f75726ba73da01813531360a59b9988b1c not found: ID does not exist"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.464161 4707 scope.go:117] "RemoveContainer" containerID="906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.467894 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec"} err="failed to get container status \"906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec\": rpc error: code = NotFound desc = could not find container \"906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec\": container with ID starting with 906a995c8b3df1369cae8e74a93464eb7b87e5da6de8fb7fffa06622cdbacdec not found: ID does not exist"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.467937 4707 scope.go:117] "RemoveContainer" containerID="ccbb722beb417e709d2860e571bb1f81fa88f2177d8fcca68da73dd5f8be6d62"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.531674 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.541684 4707 scope.go:117] "RemoveContainer" containerID="4ad2bed2ae1b1a4a31120fee2c5d39a110cad6f6f11d5bfa58db6084bab6df69"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.568687 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.580523 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 29 03:46:56 crc kubenswrapper[4707]: E0129 03:46:56.581046 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e3f825-8130-471a-b673-d42a4077accb" containerName="cinder-api"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581069 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e3f825-8130-471a-b673-d42a4077accb" containerName="cinder-api"
Jan 29 03:46:56 crc kubenswrapper[4707]: E0129 03:46:56.581093 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="ceilometer-notification-agent"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581099 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="ceilometer-notification-agent"
Jan 29 03:46:56 crc kubenswrapper[4707]: E0129 03:46:56.581112 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e3f825-8130-471a-b673-d42a4077accb" containerName="cinder-api-log"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581118 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e3f825-8130-471a-b673-d42a4077accb" containerName="cinder-api-log"
Jan 29 03:46:56 crc kubenswrapper[4707]: E0129 03:46:56.581139 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="proxy-httpd"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581146 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="proxy-httpd"
Jan 29 03:46:56 crc kubenswrapper[4707]: E0129 03:46:56.581157 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="sg-core"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581164 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="sg-core"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581354 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="sg-core"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581372 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e3f825-8130-471a-b673-d42a4077accb" containerName="cinder-api"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581380 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e3f825-8130-471a-b673-d42a4077accb" containerName="cinder-api-log"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581398 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="proxy-httpd"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.581408 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" containerName="ceilometer-notification-agent"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.582472 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.587180 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.593640 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.598706 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.598946 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.618296 4707 scope.go:117] "RemoveContainer" containerID="6b15895a18792bdda14c470f84a8e95e01aa65e1e0c4f766264206f64e735a5b"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.621673 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.624271 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.627913 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.631926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.634817 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.650951 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btszn\" (UniqueName: \"kubernetes.io/projected/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-kube-api-access-btszn\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706505 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-config-data-custom\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706552 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706583 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msdkr\" (UniqueName: \"kubernetes.io/projected/32831b02-3e9c-449c-82ce-e134b50ceec4-kube-api-access-msdkr\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706621 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-scripts\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706709 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-run-httpd\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-logs\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706830 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-config-data\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-scripts\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706871 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-config-data\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-log-httpd\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.706933 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btszn\" (UniqueName: \"kubernetes.io/projected/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-kube-api-access-btszn\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-config-data-custom\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808403 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msdkr\" (UniqueName: \"kubernetes.io/projected/32831b02-3e9c-449c-82ce-e134b50ceec4-kube-api-access-msdkr\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-scripts\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808522 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-run-httpd\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-logs\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808589 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-config-data\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-scripts\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808810 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-config-data\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-log-httpd\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.808966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.809424 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-run-httpd\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.809760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-logs\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.810356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-log-httpd\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.815180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-config-data-custom\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.815985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.816891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-scripts\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.817769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-scripts\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.837162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.845038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msdkr\" (UniqueName: \"kubernetes.io/projected/32831b02-3e9c-449c-82ce-e134b50ceec4-kube-api-access-msdkr\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.846345 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-config-data\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.846794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " pod="openstack/ceilometer-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.847528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-config-data\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.848386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btszn\" (UniqueName: \"kubernetes.io/projected/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-kube-api-access-btszn\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.850242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.850823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b270d5b9-7b1a-44b2-b915-4f63e06a10eb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b270d5b9-7b1a-44b2-b915-4f63e06a10eb\") " pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.982687 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 29 03:46:56 crc kubenswrapper[4707]: I0129 03:46:56.999523 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.040418 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.280643 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1e3f825-8130-471a-b673-d42a4077accb" path="/var/lib/kubelet/pods/a1e3f825-8130-471a-b673-d42a4077accb/volumes"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.282457 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fef3f370-7e00-4c47-be4a-23a8919e0c89" path="/var/lib/kubelet/pods/fef3f370-7e00-4c47-be4a-23a8919e0c89/volumes"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.297626 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6694d87b67-rdpz4"]
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.298294 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6694d87b67-rdpz4" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-httpd" containerID="cri-o://5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532" gracePeriod=30
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.298104 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6694d87b67-rdpz4" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-api" containerID="cri-o://eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389" gracePeriod=30
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.329567 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f8ffb664f-gwtlc"]
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.331688 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.356286 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f8ffb664f-gwtlc"]
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.407857 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6694d87b67-rdpz4" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": read tcp 10.217.0.2:55358->10.217.0.157:9696: read: connection reset by peer"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.428591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkxzn\" (UniqueName: \"kubernetes.io/projected/4d553753-4701-4a28-81dd-f7d0fbe719d6-kube-api-access-jkxzn\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.428909 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-combined-ca-bundle\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.429228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-config\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.429342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-public-tls-certs\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.429466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-ovndb-tls-certs\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.429508 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-httpd-config\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.429600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-internal-tls-certs\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.532075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-internal-tls-certs\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.532602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkxzn\" (UniqueName: \"kubernetes.io/projected/4d553753-4701-4a28-81dd-f7d0fbe719d6-kube-api-access-jkxzn\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.533142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-combined-ca-bundle\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.533235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-config\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.533282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-public-tls-certs\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.533331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-ovndb-tls-certs\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc"
Jan 29 03:46:57 crc
kubenswrapper[4707]: I0129 03:46:57.533362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-httpd-config\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.539927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-internal-tls-certs\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.545643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-config\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.545821 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-ovndb-tls-certs\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.555278 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkxzn\" (UniqueName: \"kubernetes.io/projected/4d553753-4701-4a28-81dd-f7d0fbe719d6-kube-api-access-jkxzn\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.561564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-httpd-config\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.562056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-combined-ca-bundle\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.562525 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d553753-4701-4a28-81dd-f7d0fbe719d6-public-tls-certs\") pod \"neutron-f8ffb664f-gwtlc\" (UID: \"4d553753-4701-4a28-81dd-f7d0fbe719d6\") " pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.684220 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.751732 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.868975 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.939733 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-774fc4cdc8-zk6d7" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:50376->10.217.0.165:9311: read: connection reset by peer" Jan 29 03:46:57 crc kubenswrapper[4707]: I0129 03:46:57.940116 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-774fc4cdc8-zk6d7" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:50378->10.217.0.165:9311: read: connection reset by peer" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.362528 4707 generic.go:334] "Generic (PLEG): container finished" podID="2500de98-6ed6-4399-889c-a397807fcd52" containerID="5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532" exitCode=0 Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.362602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6694d87b67-rdpz4" event={"ID":"2500de98-6ed6-4399-889c-a397807fcd52","Type":"ContainerDied","Data":"5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532"} Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.365171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerStarted","Data":"b05d3cfa3c3360dbd3163cb35da1d7a92af3d8b94c178a37e158f30bf93a5b6b"} Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.366928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b270d5b9-7b1a-44b2-b915-4f63e06a10eb","Type":"ContainerStarted","Data":"8f3d3c2643e562a375f15b86548c85757598f066fed36ec6f446795ee4b13bc5"} Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.368885 4707 generic.go:334] "Generic (PLEG): container finished" podID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerID="5da4aca2b346c268062806371ccff5c4ef359dd215e0ac019744dc7bc7333680" exitCode=0 Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.368919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-774fc4cdc8-zk6d7" event={"ID":"be340738-ebb6-4cc0-8942-e6c0f6d59f6a","Type":"ContainerDied","Data":"5da4aca2b346c268062806371ccff5c4ef359dd215e0ac019744dc7bc7333680"} Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.368936 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-774fc4cdc8-zk6d7" event={"ID":"be340738-ebb6-4cc0-8942-e6c0f6d59f6a","Type":"ContainerDied","Data":"157bbf20712c650909465337fd0350d744656ffefbf421e4bc5d40aeaaebc80f"} Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.368947 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="157bbf20712c650909465337fd0350d744656ffefbf421e4bc5d40aeaaebc80f" Jan 29 03:46:58 crc kubenswrapper[4707]: W0129 03:46:58.380380 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d553753_4701_4a28_81dd_f7d0fbe719d6.slice/crio-ca6d3ea370a6d43324491cadf5e539562fce1a1525411e68f54928a02d3005e2 WatchSource:0}: Error finding container ca6d3ea370a6d43324491cadf5e539562fce1a1525411e68f54928a02d3005e2: Status 404 returned error can't 
find the container with id ca6d3ea370a6d43324491cadf5e539562fce1a1525411e68f54928a02d3005e2 Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.387918 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f8ffb664f-gwtlc"] Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.579220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.685820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data\") pod \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.685926 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data-custom\") pod \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.686070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plbj9\" (UniqueName: \"kubernetes.io/projected/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-kube-api-access-plbj9\") pod \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.686110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-logs\") pod \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.686161 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-combined-ca-bundle\") pod \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\" (UID: \"be340738-ebb6-4cc0-8942-e6c0f6d59f6a\") " Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.687854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-logs" (OuterVolumeSpecName: "logs") pod "be340738-ebb6-4cc0-8942-e6c0f6d59f6a" (UID: "be340738-ebb6-4cc0-8942-e6c0f6d59f6a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.693613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be340738-ebb6-4cc0-8942-e6c0f6d59f6a" (UID: "be340738-ebb6-4cc0-8942-e6c0f6d59f6a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.695871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-kube-api-access-plbj9" (OuterVolumeSpecName: "kube-api-access-plbj9") pod "be340738-ebb6-4cc0-8942-e6c0f6d59f6a" (UID: "be340738-ebb6-4cc0-8942-e6c0f6d59f6a"). InnerVolumeSpecName "kube-api-access-plbj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.766547 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be340738-ebb6-4cc0-8942-e6c0f6d59f6a" (UID: "be340738-ebb6-4cc0-8942-e6c0f6d59f6a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.788672 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.788701 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plbj9\" (UniqueName: \"kubernetes.io/projected/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-kube-api-access-plbj9\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.788716 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.788725 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.796937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data" (OuterVolumeSpecName: "config-data") pod "be340738-ebb6-4cc0-8942-e6c0f6d59f6a" (UID: "be340738-ebb6-4cc0-8942-e6c0f6d59f6a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:46:58 crc kubenswrapper[4707]: I0129 03:46:58.891217 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be340738-ebb6-4cc0-8942-e6c0f6d59f6a-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.401206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerStarted","Data":"1118eb4b1e4171cb3d663e7f4a87b7c13f51a7f00ab7659e586d0b5c2c518d12"} Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.403163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b270d5b9-7b1a-44b2-b915-4f63e06a10eb","Type":"ContainerStarted","Data":"79c3211b636ecc7eeb563a7bd63f3df94a3951c97a5ff1d0bf37b8dcef34b3d5"} Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.405021 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-774fc4cdc8-zk6d7" Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.407049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8ffb664f-gwtlc" event={"ID":"4d553753-4701-4a28-81dd-f7d0fbe719d6","Type":"ContainerStarted","Data":"2c2be1bb6b11d4bdce1a11178834a11c3a626a424058cd1860c30c3c4ef54c48"} Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.407075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8ffb664f-gwtlc" event={"ID":"4d553753-4701-4a28-81dd-f7d0fbe719d6","Type":"ContainerStarted","Data":"a484d05032f91d94c102d85e06c0fe07da29f86e6fa386bb4ba986139eee2dee"} Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.407086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8ffb664f-gwtlc" event={"ID":"4d553753-4701-4a28-81dd-f7d0fbe719d6","Type":"ContainerStarted","Data":"ca6d3ea370a6d43324491cadf5e539562fce1a1525411e68f54928a02d3005e2"} Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.407101 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.451759 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f8ffb664f-gwtlc" podStartSLOduration=2.451733156 podStartE2EDuration="2.451733156s" podCreationTimestamp="2026-01-29 03:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:46:59.443704697 +0000 UTC m=+1172.927933692" watchObservedRunningTime="2026-01-29 03:46:59.451733156 +0000 UTC m=+1172.935962061" Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.476892 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-774fc4cdc8-zk6d7"] Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.488474 4707 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/barbican-api-774fc4cdc8-zk6d7"] Jan 29 03:46:59 crc kubenswrapper[4707]: I0129 03:46:59.594219 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6694d87b67-rdpz4" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Jan 29 03:47:00 crc kubenswrapper[4707]: I0129 03:47:00.416826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b270d5b9-7b1a-44b2-b915-4f63e06a10eb","Type":"ContainerStarted","Data":"5db4e1c3a9be8c751f222185a62ea930b53e11af85dfc4694c7f3a84cf55af3e"} Jan 29 03:47:00 crc kubenswrapper[4707]: I0129 03:47:00.417359 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 29 03:47:00 crc kubenswrapper[4707]: I0129 03:47:00.419169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerStarted","Data":"3697beb5a38c159a124a16293b7b50a54fa83bbe07926ebb3ff3967c0594b670"} Jan 29 03:47:00 crc kubenswrapper[4707]: I0129 03:47:00.462176 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.4621477 podStartE2EDuration="4.4621477s" podCreationTimestamp="2026-01-29 03:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:00.454849732 +0000 UTC m=+1173.939078637" watchObservedRunningTime="2026-01-29 03:47:00.4621477 +0000 UTC m=+1173.946376605" Jan 29 03:47:01 crc kubenswrapper[4707]: I0129 03:47:01.256424 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" path="/var/lib/kubelet/pods/be340738-ebb6-4cc0-8942-e6c0f6d59f6a/volumes" Jan 29 
03:47:01 crc kubenswrapper[4707]: I0129 03:47:01.432662 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerStarted","Data":"fe5b8da807d053532b8d5b7a876092045cce18f1599255f3fdb23be11a472a13"} Jan 29 03:47:01 crc kubenswrapper[4707]: I0129 03:47:01.688769 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" Jan 29 03:47:01 crc kubenswrapper[4707]: I0129 03:47:01.817310 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-pj4lj"] Jan 29 03:47:01 crc kubenswrapper[4707]: I0129 03:47:01.817691 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" podUID="07580a17-d3c5-4103-a8a5-cb85569104ec" containerName="dnsmasq-dns" containerID="cri-o://1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57" gracePeriod=10 Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.058071 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.164926 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.421042 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.447121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerStarted","Data":"603e5fb1c5b5f4515460a4762b0fb2e3e2cda34cdf077e1429bc2e7e85fa3436"} Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.448515 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.457342 4707 generic.go:334] "Generic (PLEG): container finished" podID="07580a17-d3c5-4103-a8a5-cb85569104ec" containerID="1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57" exitCode=0 Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.457422 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.457416 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" event={"ID":"07580a17-d3c5-4103-a8a5-cb85569104ec","Type":"ContainerDied","Data":"1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57"} Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.457515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-pj4lj" event={"ID":"07580a17-d3c5-4103-a8a5-cb85569104ec","Type":"ContainerDied","Data":"4bbe2cf824179ad7fb088a0d81e46e30cdfff3ee1f893b9fc718b5e8f9e2c915"} Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.457559 4707 scope.go:117] "RemoveContainer" containerID="1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.457853 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="23408df0-eb02-4602-ada1-f85c0cffb4a5" 
containerName="cinder-scheduler" containerID="cri-o://982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964" gracePeriod=30 Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.457859 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="23408df0-eb02-4602-ada1-f85c0cffb4a5" containerName="probe" containerID="cri-o://ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1" gracePeriod=30 Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.488321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jhw4\" (UniqueName: \"kubernetes.io/projected/07580a17-d3c5-4103-a8a5-cb85569104ec-kube-api-access-6jhw4\") pod \"07580a17-d3c5-4103-a8a5-cb85569104ec\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.488374 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-nb\") pod \"07580a17-d3c5-4103-a8a5-cb85569104ec\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.488493 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-config\") pod \"07580a17-d3c5-4103-a8a5-cb85569104ec\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.488590 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-sb\") pod \"07580a17-d3c5-4103-a8a5-cb85569104ec\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.488692 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-swift-storage-0\") pod \"07580a17-d3c5-4103-a8a5-cb85569104ec\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.488785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-svc\") pod \"07580a17-d3c5-4103-a8a5-cb85569104ec\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.498094 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.557009558 podStartE2EDuration="6.498070348s" podCreationTimestamp="2026-01-29 03:46:56 +0000 UTC" firstStartedPulling="2026-01-29 03:46:57.914141618 +0000 UTC m=+1171.398370523" lastFinishedPulling="2026-01-29 03:47:01.855202408 +0000 UTC m=+1175.339431313" observedRunningTime="2026-01-29 03:47:02.47706741 +0000 UTC m=+1175.961296315" watchObservedRunningTime="2026-01-29 03:47:02.498070348 +0000 UTC m=+1175.982299253" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.499030 4707 scope.go:117] "RemoveContainer" containerID="d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.500888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07580a17-d3c5-4103-a8a5-cb85569104ec-kube-api-access-6jhw4" (OuterVolumeSpecName: "kube-api-access-6jhw4") pod "07580a17-d3c5-4103-a8a5-cb85569104ec" (UID: "07580a17-d3c5-4103-a8a5-cb85569104ec"). InnerVolumeSpecName "kube-api-access-6jhw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.531219 4707 scope.go:117] "RemoveContainer" containerID="1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57" Jan 29 03:47:02 crc kubenswrapper[4707]: E0129 03:47:02.531785 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57\": container with ID starting with 1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57 not found: ID does not exist" containerID="1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.531830 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57"} err="failed to get container status \"1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57\": rpc error: code = NotFound desc = could not find container \"1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57\": container with ID starting with 1f5cb03abecf40592ea11ed9c1b00f50fde0170956a21876769be56501c64f57 not found: ID does not exist" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.531857 4707 scope.go:117] "RemoveContainer" containerID="d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61" Jan 29 03:47:02 crc kubenswrapper[4707]: E0129 03:47:02.535702 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61\": container with ID starting with d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61 not found: ID does not exist" containerID="d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.535764 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61"} err="failed to get container status \"d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61\": rpc error: code = NotFound desc = could not find container \"d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61\": container with ID starting with d0af23fb9ddeb6b8a57cda748c44027520f22a57c5f32d9d342c3946dbd0ba61 not found: ID does not exist" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.546559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "07580a17-d3c5-4103-a8a5-cb85569104ec" (UID: "07580a17-d3c5-4103-a8a5-cb85569104ec"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.549283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-config" (OuterVolumeSpecName: "config") pod "07580a17-d3c5-4103-a8a5-cb85569104ec" (UID: "07580a17-d3c5-4103-a8a5-cb85569104ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:02 crc kubenswrapper[4707]: E0129 03:47:02.562975 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-nb podName:07580a17-d3c5-4103-a8a5-cb85569104ec nodeName:}" failed. No retries permitted until 2026-01-29 03:47:03.062932387 +0000 UTC m=+1176.547161292 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-nb") pod "07580a17-d3c5-4103-a8a5-cb85569104ec" (UID: "07580a17-d3c5-4103-a8a5-cb85569104ec") : error deleting /var/lib/kubelet/pods/07580a17-d3c5-4103-a8a5-cb85569104ec/volume-subpaths: remove /var/lib/kubelet/pods/07580a17-d3c5-4103-a8a5-cb85569104ec/volume-subpaths: no such file or directory Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.563274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07580a17-d3c5-4103-a8a5-cb85569104ec" (UID: "07580a17-d3c5-4103-a8a5-cb85569104ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.563391 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07580a17-d3c5-4103-a8a5-cb85569104ec" (UID: "07580a17-d3c5-4103-a8a5-cb85569104ec"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.590904 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.590942 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.590957 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.590971 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:02 crc kubenswrapper[4707]: I0129 03:47:02.590979 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jhw4\" (UniqueName: \"kubernetes.io/projected/07580a17-d3c5-4103-a8a5-cb85569104ec-kube-api-access-6jhw4\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.101212 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-nb\") pod \"07580a17-d3c5-4103-a8a5-cb85569104ec\" (UID: \"07580a17-d3c5-4103-a8a5-cb85569104ec\") " Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.102506 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-nb" (OuterVolumeSpecName: 
"ovsdbserver-nb") pod "07580a17-d3c5-4103-a8a5-cb85569104ec" (UID: "07580a17-d3c5-4103-a8a5-cb85569104ec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.208974 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07580a17-d3c5-4103-a8a5-cb85569104ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.293217 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6694d87b67-rdpz4" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.310758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-ovndb-tls-certs\") pod \"2500de98-6ed6-4399-889c-a397807fcd52\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.310881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-config\") pod \"2500de98-6ed6-4399-889c-a397807fcd52\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.310914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-public-tls-certs\") pod \"2500de98-6ed6-4399-889c-a397807fcd52\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.310958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-internal-tls-certs\") pod 
\"2500de98-6ed6-4399-889c-a397807fcd52\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.311072 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-combined-ca-bundle\") pod \"2500de98-6ed6-4399-889c-a397807fcd52\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.311092 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsjlz\" (UniqueName: \"kubernetes.io/projected/2500de98-6ed6-4399-889c-a397807fcd52-kube-api-access-qsjlz\") pod \"2500de98-6ed6-4399-889c-a397807fcd52\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.311166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-httpd-config\") pod \"2500de98-6ed6-4399-889c-a397807fcd52\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.326464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2500de98-6ed6-4399-889c-a397807fcd52-kube-api-access-qsjlz" (OuterVolumeSpecName: "kube-api-access-qsjlz") pod "2500de98-6ed6-4399-889c-a397807fcd52" (UID: "2500de98-6ed6-4399-889c-a397807fcd52"). InnerVolumeSpecName "kube-api-access-qsjlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.326701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2500de98-6ed6-4399-889c-a397807fcd52" (UID: "2500de98-6ed6-4399-889c-a397807fcd52"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.386865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2500de98-6ed6-4399-889c-a397807fcd52" (UID: "2500de98-6ed6-4399-889c-a397807fcd52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.399202 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-pj4lj"] Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.406935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-config" (OuterVolumeSpecName: "config") pod "2500de98-6ed6-4399-889c-a397807fcd52" (UID: "2500de98-6ed6-4399-889c-a397807fcd52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.410254 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-pj4lj"] Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.415161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2500de98-6ed6-4399-889c-a397807fcd52" (UID: "2500de98-6ed6-4399-889c-a397807fcd52"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.415850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-public-tls-certs\") pod \"2500de98-6ed6-4399-889c-a397807fcd52\" (UID: \"2500de98-6ed6-4399-889c-a397807fcd52\") " Jan 29 03:47:03 crc kubenswrapper[4707]: W0129 03:47:03.415987 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2500de98-6ed6-4399-889c-a397807fcd52/volumes/kubernetes.io~secret/public-tls-certs Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.416008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2500de98-6ed6-4399-889c-a397807fcd52" (UID: "2500de98-6ed6-4399-889c-a397807fcd52"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.416475 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.416553 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsjlz\" (UniqueName: \"kubernetes.io/projected/2500de98-6ed6-4399-889c-a397807fcd52-kube-api-access-qsjlz\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.416614 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.416666 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.416715 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.438609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2500de98-6ed6-4399-889c-a397807fcd52" (UID: "2500de98-6ed6-4399-889c-a397807fcd52"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.441555 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2500de98-6ed6-4399-889c-a397807fcd52" (UID: "2500de98-6ed6-4399-889c-a397807fcd52"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.470460 4707 generic.go:334] "Generic (PLEG): container finished" podID="2500de98-6ed6-4399-889c-a397807fcd52" containerID="eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389" exitCode=0 Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.470569 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6694d87b67-rdpz4" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.470572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6694d87b67-rdpz4" event={"ID":"2500de98-6ed6-4399-889c-a397807fcd52","Type":"ContainerDied","Data":"eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389"} Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.475326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6694d87b67-rdpz4" event={"ID":"2500de98-6ed6-4399-889c-a397807fcd52","Type":"ContainerDied","Data":"fd20d2208d0a082f1870b295201ad1c8cd74894739b8430f27de978823c459b9"} Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.475363 4707 scope.go:117] "RemoveContainer" containerID="5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.481859 4707 generic.go:334] "Generic (PLEG): container finished" podID="23408df0-eb02-4602-ada1-f85c0cffb4a5" containerID="ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1" exitCode=0 Jan 29 
03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.481972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23408df0-eb02-4602-ada1-f85c0cffb4a5","Type":"ContainerDied","Data":"ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1"} Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.498869 4707 scope.go:117] "RemoveContainer" containerID="eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.518713 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6694d87b67-rdpz4"] Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.519157 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.519193 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2500de98-6ed6-4399-889c-a397807fcd52-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.525557 4707 scope.go:117] "RemoveContainer" containerID="5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532" Jan 29 03:47:03 crc kubenswrapper[4707]: E0129 03:47:03.526343 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532\": container with ID starting with 5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532 not found: ID does not exist" containerID="5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.526402 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532"} err="failed to get container status \"5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532\": rpc error: code = NotFound desc = could not find container \"5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532\": container with ID starting with 5ce469b31f223df3e69ebd5129e7629f2444bf0622236e7e941ee487f0f46532 not found: ID does not exist" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.526436 4707 scope.go:117] "RemoveContainer" containerID="eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389" Jan 29 03:47:03 crc kubenswrapper[4707]: E0129 03:47:03.526967 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389\": container with ID starting with eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389 not found: ID does not exist" containerID="eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.527020 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389"} err="failed to get container status \"eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389\": rpc error: code = NotFound desc = could not find container \"eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389\": container with ID starting with eba333c051ceeaf3120096bd480883c96412b4df3cf1f0ba543aaefca33dd389 not found: ID does not exist" Jan 29 03:47:03 crc kubenswrapper[4707]: I0129 03:47:03.527176 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6694d87b67-rdpz4"] Jan 29 03:47:05 crc kubenswrapper[4707]: I0129 03:47:05.275820 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="07580a17-d3c5-4103-a8a5-cb85569104ec" path="/var/lib/kubelet/pods/07580a17-d3c5-4103-a8a5-cb85569104ec/volumes" Jan 29 03:47:05 crc kubenswrapper[4707]: I0129 03:47:05.277497 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2500de98-6ed6-4399-889c-a397807fcd52" path="/var/lib/kubelet/pods/2500de98-6ed6-4399-889c-a397807fcd52/volumes" Jan 29 03:47:06 crc kubenswrapper[4707]: I0129 03:47:06.986276 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.009605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data\") pod \"23408df0-eb02-4602-ada1-f85c0cffb4a5\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.009667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-scripts\") pod \"23408df0-eb02-4602-ada1-f85c0cffb4a5\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.009770 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data-custom\") pod \"23408df0-eb02-4602-ada1-f85c0cffb4a5\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.009803 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23408df0-eb02-4602-ada1-f85c0cffb4a5-etc-machine-id\") pod \"23408df0-eb02-4602-ada1-f85c0cffb4a5\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 
03:47:07.009830 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/23408df0-eb02-4602-ada1-f85c0cffb4a5-kube-api-access-rxs45\") pod \"23408df0-eb02-4602-ada1-f85c0cffb4a5\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.009942 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23408df0-eb02-4602-ada1-f85c0cffb4a5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "23408df0-eb02-4602-ada1-f85c0cffb4a5" (UID: "23408df0-eb02-4602-ada1-f85c0cffb4a5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.010023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-combined-ca-bundle\") pod \"23408df0-eb02-4602-ada1-f85c0cffb4a5\" (UID: \"23408df0-eb02-4602-ada1-f85c0cffb4a5\") " Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.010492 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23408df0-eb02-4602-ada1-f85c0cffb4a5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.025329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-scripts" (OuterVolumeSpecName: "scripts") pod "23408df0-eb02-4602-ada1-f85c0cffb4a5" (UID: "23408df0-eb02-4602-ada1-f85c0cffb4a5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.025363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23408df0-eb02-4602-ada1-f85c0cffb4a5-kube-api-access-rxs45" (OuterVolumeSpecName: "kube-api-access-rxs45") pod "23408df0-eb02-4602-ada1-f85c0cffb4a5" (UID: "23408df0-eb02-4602-ada1-f85c0cffb4a5"). InnerVolumeSpecName "kube-api-access-rxs45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.047712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "23408df0-eb02-4602-ada1-f85c0cffb4a5" (UID: "23408df0-eb02-4602-ada1-f85c0cffb4a5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.115467 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.115502 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.115513 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxs45\" (UniqueName: \"kubernetes.io/projected/23408df0-eb02-4602-ada1-f85c0cffb4a5-kube-api-access-rxs45\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.131310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "23408df0-eb02-4602-ada1-f85c0cffb4a5" (UID: "23408df0-eb02-4602-ada1-f85c0cffb4a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.209685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data" (OuterVolumeSpecName: "config-data") pod "23408df0-eb02-4602-ada1-f85c0cffb4a5" (UID: "23408df0-eb02-4602-ada1-f85c0cffb4a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.217495 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.217570 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23408df0-eb02-4602-ada1-f85c0cffb4a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.533095 4707 generic.go:334] "Generic (PLEG): container finished" podID="23408df0-eb02-4602-ada1-f85c0cffb4a5" containerID="982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964" exitCode=0 Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.533176 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23408df0-eb02-4602-ada1-f85c0cffb4a5","Type":"ContainerDied","Data":"982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964"} Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.533596 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"23408df0-eb02-4602-ada1-f85c0cffb4a5","Type":"ContainerDied","Data":"7ae34194173c025db2dbe80ec302d794091a2af94cb1a696036f52eaf989d47b"} Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.533623 4707 scope.go:117] "RemoveContainer" containerID="ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.533201 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.567450 4707 scope.go:117] "RemoveContainer" containerID="982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.567817 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.577466 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.607237 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.607789 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api-log" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.607804 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api-log" Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.607819 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-api" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.607825 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-api" Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.607856 
4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23408df0-eb02-4602-ada1-f85c0cffb4a5" containerName="probe" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.607865 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="23408df0-eb02-4602-ada1-f85c0cffb4a5" containerName="probe" Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.607886 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07580a17-d3c5-4103-a8a5-cb85569104ec" containerName="init" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.607893 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="07580a17-d3c5-4103-a8a5-cb85569104ec" containerName="init" Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.607903 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-httpd" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.607909 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-httpd" Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.607922 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23408df0-eb02-4602-ada1-f85c0cffb4a5" containerName="cinder-scheduler" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.607928 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="23408df0-eb02-4602-ada1-f85c0cffb4a5" containerName="cinder-scheduler" Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.607940 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07580a17-d3c5-4103-a8a5-cb85569104ec" containerName="dnsmasq-dns" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.607945 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="07580a17-d3c5-4103-a8a5-cb85569104ec" containerName="dnsmasq-dns" Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.607957 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.607963 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.608128 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.608146 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="23408df0-eb02-4602-ada1-f85c0cffb4a5" containerName="probe" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.608155 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="07580a17-d3c5-4103-a8a5-cb85569104ec" containerName="dnsmasq-dns" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.608162 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-httpd" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.608172 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2500de98-6ed6-4399-889c-a397807fcd52" containerName="neutron-api" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.608184 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="23408df0-eb02-4602-ada1-f85c0cffb4a5" containerName="cinder-scheduler" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.608192 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="be340738-ebb6-4cc0-8942-e6c0f6d59f6a" containerName="barbican-api-log" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.609198 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.615137 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.625678 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.625726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-scripts\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.625818 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-config-data\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.625841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krxgg\" (UniqueName: \"kubernetes.io/projected/539d5b33-91ee-4790-941f-22c82388ed87-kube-api-access-krxgg\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.625881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/539d5b33-91ee-4790-941f-22c82388ed87-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.625913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.695584 4707 scope.go:117] "RemoveContainer" containerID="ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1" Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.696464 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1\": container with ID starting with ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1 not found: ID does not exist" containerID="ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.696572 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1"} err="failed to get container status \"ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1\": rpc error: code = NotFound desc = could not find container \"ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1\": container with ID starting with ab6c6f94ea1e1a7ab3f4ec897a4736efbf9b88f588fcf4fd8da461e85841b0b1 not found: ID does not exist" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.696630 4707 scope.go:117] "RemoveContainer" 
containerID="982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964" Jan 29 03:47:07 crc kubenswrapper[4707]: E0129 03:47:07.707530 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964\": container with ID starting with 982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964 not found: ID does not exist" containerID="982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.707680 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964"} err="failed to get container status \"982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964\": rpc error: code = NotFound desc = could not find container \"982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964\": container with ID starting with 982d5ecb30657e30775eaaa456ec3a5b4830839e013c8f2f96a96d42e32d7964 not found: ID does not exist" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.728967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-config-data\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.729068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krxgg\" (UniqueName: \"kubernetes.io/projected/539d5b33-91ee-4790-941f-22c82388ed87-kube-api-access-krxgg\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.729129 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/539d5b33-91ee-4790-941f-22c82388ed87-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.729185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.729267 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.729296 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-scripts\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.729850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/539d5b33-91ee-4790-941f-22c82388ed87-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.747958 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.751294 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-config-data\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.751798 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.794577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-scripts\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.807235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krxgg\" (UniqueName: \"kubernetes.io/projected/539d5b33-91ee-4790-941f-22c82388ed87-kube-api-access-krxgg\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:07 crc kubenswrapper[4707]: I0129 03:47:07.813658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/539d5b33-91ee-4790-941f-22c82388ed87-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"539d5b33-91ee-4790-941f-22c82388ed87\") " pod="openstack/cinder-scheduler-0" Jan 29 03:47:08 crc kubenswrapper[4707]: I0129 03:47:08.006183 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 03:47:08 crc kubenswrapper[4707]: I0129 03:47:08.536950 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 03:47:08 crc kubenswrapper[4707]: W0129 03:47:08.547018 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod539d5b33_91ee_4790_941f_22c82388ed87.slice/crio-98a42767691014e84853feb966ea6a5d7bd8d3af94b84646659c47111f93c7a9 WatchSource:0}: Error finding container 98a42767691014e84853feb966ea6a5d7bd8d3af94b84646659c47111f93c7a9: Status 404 returned error can't find the container with id 98a42767691014e84853feb966ea6a5d7bd8d3af94b84646659c47111f93c7a9 Jan 29 03:47:08 crc kubenswrapper[4707]: I0129 03:47:08.738344 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:47:08 crc kubenswrapper[4707]: I0129 03:47:08.780753 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54fd6b997b-lbt29" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.026460 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9f459ff7d-tkv2s"] Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.028209 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.050842 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f459ff7d-tkv2s"] Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.177585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-combined-ca-bundle\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.178128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-internal-tls-certs\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.178181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-scripts\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.178231 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-config-data\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.178313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-public-tls-certs\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.178336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-logs\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.178384 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btgcm\" (UniqueName: \"kubernetes.io/projected/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-kube-api-access-btgcm\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.259412 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23408df0-eb02-4602-ada1-f85c0cffb4a5" path="/var/lib/kubelet/pods/23408df0-eb02-4602-ada1-f85c0cffb4a5/volumes" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.279811 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btgcm\" (UniqueName: \"kubernetes.io/projected/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-kube-api-access-btgcm\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.279899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-combined-ca-bundle\") pod \"placement-9f459ff7d-tkv2s\" (UID: 
\"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.279946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-internal-tls-certs\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.279971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-scripts\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.280007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-config-data\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.280065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-public-tls-certs\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.280085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-logs\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc 
kubenswrapper[4707]: I0129 03:47:09.280580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-logs\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.284308 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-combined-ca-bundle\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.285220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-scripts\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.289120 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-config-data\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.292486 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-internal-tls-certs\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.302344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-public-tls-certs\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.303327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btgcm\" (UniqueName: \"kubernetes.io/projected/2ceecbc6-bf80-4008-80e3-0a43426cf4c6-kube-api-access-btgcm\") pod \"placement-9f459ff7d-tkv2s\" (UID: \"2ceecbc6-bf80-4008-80e3-0a43426cf4c6\") " pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.306227 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6568fdcd45-j5nxz" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.353400 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.587179 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"539d5b33-91ee-4790-941f-22c82388ed87","Type":"ContainerStarted","Data":"73b3b417f0e68a0bfec0770f49a426195410e511ab88a81894313602277c7753"} Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.587613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"539d5b33-91ee-4790-941f-22c82388ed87","Type":"ContainerStarted","Data":"98a42767691014e84853feb966ea6a5d7bd8d3af94b84646659c47111f93c7a9"} Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.594736 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.596472 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.599657 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-pdp25" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.600986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.601232 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.637707 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.692649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.692707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.692741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config-secret\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.693475 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbjls\" (UniqueName: \"kubernetes.io/projected/f99c0663-dedc-4e59-88be-7bce23026c24-kube-api-access-lbjls\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.762334 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f459ff7d-tkv2s"] Jan 29 03:47:09 crc kubenswrapper[4707]: W0129 03:47:09.772809 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ceecbc6_bf80_4008_80e3_0a43426cf4c6.slice/crio-6ad586d1542e6e2b15a1c7cfc8d365a935272168747d4967146ad8b3f4531b21 WatchSource:0}: Error finding container 6ad586d1542e6e2b15a1c7cfc8d365a935272168747d4967146ad8b3f4531b21: Status 404 returned error can't find the container with id 6ad586d1542e6e2b15a1c7cfc8d365a935272168747d4967146ad8b3f4531b21 Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.795384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbjls\" (UniqueName: \"kubernetes.io/projected/f99c0663-dedc-4e59-88be-7bce23026c24-kube-api-access-lbjls\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.797642 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.797754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.797855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config-secret\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.799330 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.804075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.804170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config-secret\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.810434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbjls\" (UniqueName: \"kubernetes.io/projected/f99c0663-dedc-4e59-88be-7bce23026c24-kube-api-access-lbjls\") pod \"openstackclient\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " 
pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.833759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.965357 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.966268 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 03:47:09 crc kubenswrapper[4707]: I0129 03:47:09.981261 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.061623 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.063167 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.065149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.123912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.124122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbc25\" (UniqueName: \"kubernetes.io/projected/f44781c6-75de-479b-b6bb-33bc27d468fa-kube-api-access-zbc25\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.124276 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config-secret\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.146385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: E0129 03:47:10.157268 4707 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 29 03:47:10 crc kubenswrapper[4707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_f99c0663-dedc-4e59-88be-7bce23026c24_0(453513be4cc78df7c9e5842bc9a62346d9b9dbc7361e78d910c847bad26b309c): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"453513be4cc78df7c9e5842bc9a62346d9b9dbc7361e78d910c847bad26b309c" Netns:"/var/run/netns/c486761e-8cea-41e8-9421-4d2aed71ff26" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=453513be4cc78df7c9e5842bc9a62346d9b9dbc7361e78d910c847bad26b309c;K8S_POD_UID=f99c0663-dedc-4e59-88be-7bce23026c24" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/f99c0663-dedc-4e59-88be-7bce23026c24]: expected pod UID "f99c0663-dedc-4e59-88be-7bce23026c24" but got "f44781c6-75de-479b-b6bb-33bc27d468fa" from Kube API Jan 29 03:47:10 crc kubenswrapper[4707]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 29 03:47:10 crc kubenswrapper[4707]: > Jan 29 03:47:10 crc kubenswrapper[4707]: E0129 03:47:10.157374 4707 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 29 03:47:10 crc kubenswrapper[4707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_f99c0663-dedc-4e59-88be-7bce23026c24_0(453513be4cc78df7c9e5842bc9a62346d9b9dbc7361e78d910c847bad26b309c): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"453513be4cc78df7c9e5842bc9a62346d9b9dbc7361e78d910c847bad26b309c" Netns:"/var/run/netns/c486761e-8cea-41e8-9421-4d2aed71ff26" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=453513be4cc78df7c9e5842bc9a62346d9b9dbc7361e78d910c847bad26b309c;K8S_POD_UID=f99c0663-dedc-4e59-88be-7bce23026c24" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/f99c0663-dedc-4e59-88be-7bce23026c24]: expected pod UID "f99c0663-dedc-4e59-88be-7bce23026c24" but got "f44781c6-75de-479b-b6bb-33bc27d468fa" from Kube API Jan 29 03:47:10 crc kubenswrapper[4707]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 29 03:47:10 crc kubenswrapper[4707]: > pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.253517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.253681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.255607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.255691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbc25\" (UniqueName: \"kubernetes.io/projected/f44781c6-75de-479b-b6bb-33bc27d468fa-kube-api-access-zbc25\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.256207 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config-secret\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.257198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.273100 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config-secret\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.280353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbc25\" (UniqueName: \"kubernetes.io/projected/f44781c6-75de-479b-b6bb-33bc27d468fa-kube-api-access-zbc25\") pod \"openstackclient\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.391437 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.642004 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"539d5b33-91ee-4790-941f-22c82388ed87","Type":"ContainerStarted","Data":"fdc7e242fb4316cf4240d0a4a6aa585fc89b2eec64ab5cdf5f83b1edfe72b50c"} Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.653983 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.655417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f459ff7d-tkv2s" event={"ID":"2ceecbc6-bf80-4008-80e3-0a43426cf4c6","Type":"ContainerStarted","Data":"ef9af638a16f992f931efa5f62b7d781d5da328fb19a8c447860f1fc9400208e"} Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.655446 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.655456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f459ff7d-tkv2s" event={"ID":"2ceecbc6-bf80-4008-80e3-0a43426cf4c6","Type":"ContainerStarted","Data":"f71d3f7b9e53dab24805249411fbc310b594de08a1512f667c1d2eab034e8fc7"} Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.655465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f459ff7d-tkv2s" event={"ID":"2ceecbc6-bf80-4008-80e3-0a43426cf4c6","Type":"ContainerStarted","Data":"6ad586d1542e6e2b15a1c7cfc8d365a935272168747d4967146ad8b3f4531b21"} Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.655487 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.692108 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.693376 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.693347462 podStartE2EDuration="3.693347462s" podCreationTimestamp="2026-01-29 03:47:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:10.692660633 +0000 UTC m=+1184.176889538" watchObservedRunningTime="2026-01-29 03:47:10.693347462 +0000 UTC m=+1184.177576377" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.736886 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f99c0663-dedc-4e59-88be-7bce23026c24" podUID="f44781c6-75de-479b-b6bb-33bc27d468fa" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.744937 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9f459ff7d-tkv2s" podStartSLOduration=1.7449138720000001 podStartE2EDuration="1.744913872s" podCreationTimestamp="2026-01-29 03:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:10.718159709 +0000 UTC m=+1184.202388624" watchObservedRunningTime="2026-01-29 03:47:10.744913872 +0000 UTC m=+1184.229142777" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.809849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config\") pod \"f99c0663-dedc-4e59-88be-7bce23026c24\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.809924 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbjls\" 
(UniqueName: \"kubernetes.io/projected/f99c0663-dedc-4e59-88be-7bce23026c24-kube-api-access-lbjls\") pod \"f99c0663-dedc-4e59-88be-7bce23026c24\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.810093 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config-secret\") pod \"f99c0663-dedc-4e59-88be-7bce23026c24\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.810231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-combined-ca-bundle\") pod \"f99c0663-dedc-4e59-88be-7bce23026c24\" (UID: \"f99c0663-dedc-4e59-88be-7bce23026c24\") " Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.813816 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f99c0663-dedc-4e59-88be-7bce23026c24" (UID: "f99c0663-dedc-4e59-88be-7bce23026c24"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.825953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99c0663-dedc-4e59-88be-7bce23026c24-kube-api-access-lbjls" (OuterVolumeSpecName: "kube-api-access-lbjls") pod "f99c0663-dedc-4e59-88be-7bce23026c24" (UID: "f99c0663-dedc-4e59-88be-7bce23026c24"). InnerVolumeSpecName "kube-api-access-lbjls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.837628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f99c0663-dedc-4e59-88be-7bce23026c24" (UID: "f99c0663-dedc-4e59-88be-7bce23026c24"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.840727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f99c0663-dedc-4e59-88be-7bce23026c24" (UID: "f99c0663-dedc-4e59-88be-7bce23026c24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.913321 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.913374 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbjls\" (UniqueName: \"kubernetes.io/projected/f99c0663-dedc-4e59-88be-7bce23026c24-kube-api-access-lbjls\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.913399 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:10 crc kubenswrapper[4707]: I0129 03:47:10.913412 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f99c0663-dedc-4e59-88be-7bce23026c24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:11 crc kubenswrapper[4707]: I0129 03:47:11.010978 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 03:47:11 crc kubenswrapper[4707]: I0129 03:47:11.255769 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99c0663-dedc-4e59-88be-7bce23026c24" path="/var/lib/kubelet/pods/f99c0663-dedc-4e59-88be-7bce23026c24/volumes" Jan 29 03:47:11 crc kubenswrapper[4707]: I0129 03:47:11.678978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f44781c6-75de-479b-b6bb-33bc27d468fa","Type":"ContainerStarted","Data":"8e65ca7cbf0d60c41b062ace364d2ce5b29db1153573bfce5cb19ed14dd9c368"} Jan 29 03:47:11 crc kubenswrapper[4707]: I0129 03:47:11.679020 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 03:47:11 crc kubenswrapper[4707]: I0129 03:47:11.692419 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f99c0663-dedc-4e59-88be-7bce23026c24" podUID="f44781c6-75de-479b-b6bb-33bc27d468fa" Jan 29 03:47:13 crc kubenswrapper[4707]: I0129 03:47:13.007373 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.106260 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-79896f576-thgpc"] Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.107498 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.118551 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.118957 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.118957 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-xg84d" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.170818 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79896f576-thgpc"] Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.187430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data-custom\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.187511 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-combined-ca-bundle\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.187551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 
03:47:14.187653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s27xr\" (UniqueName: \"kubernetes.io/projected/35911d95-d558-41c9-9c2b-811b60410a49-kube-api-access-s27xr\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.299091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-combined-ca-bundle\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.299190 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.299395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s27xr\" (UniqueName: \"kubernetes.io/projected/35911d95-d558-41c9-9c2b-811b60410a49-kube-api-access-s27xr\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.299458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data-custom\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.370913 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data-custom\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.384990 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-combined-ca-bundle\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.392911 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s27xr\" (UniqueName: \"kubernetes.io/projected/35911d95-d558-41c9-9c2b-811b60410a49-kube-api-access-s27xr\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.403973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data\") pod \"heat-engine-79896f576-thgpc\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.431728 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.460615 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-5hptj"] Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.462564 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.544411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.544467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-config\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.544506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.544526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf6k2\" (UniqueName: \"kubernetes.io/projected/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-kube-api-access-zf6k2\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.547615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" 
(UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.547683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.566927 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7bdf7467d4-ptkdm"] Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.568381 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.583295 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.601777 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-5hptj"] Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.631907 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5bcfc656b8-jq7ns"] Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.633901 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5bcfc656b8-jq7ns" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.636986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.648747 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7bdf7467d4-ptkdm"] Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.650632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.650677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf6k2\" (UniqueName: \"kubernetes.io/projected/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-kube-api-access-zf6k2\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.650702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.650738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:47:14 crc 
kubenswrapper[4707]: I0129 03:47:14.650770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.650842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-combined-ca-bundle\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.650899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data-custom\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.650924 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.650942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gkpz\" (UniqueName: \"kubernetes.io/projected/0ec32f0d-8609-4579-abc4-1af5f98df4cf-kube-api-access-4gkpz\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " 
pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.650975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-config\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.651869 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-config\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.652422 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.653216 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.654009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.654560 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.656740 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5bcfc656b8-jq7ns"] Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.676521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf6k2\" (UniqueName: \"kubernetes.io/projected/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-kube-api-access-zf6k2\") pod \"dnsmasq-dns-7756b9d78c-5hptj\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") " pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.753646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-combined-ca-bundle\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.753771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-combined-ca-bundle\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns" Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.753862 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data-custom\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: 
\"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.753954 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gkpz\" (UniqueName: \"kubernetes.io/projected/0ec32f0d-8609-4579-abc4-1af5f98df4cf-kube-api-access-4gkpz\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.754038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gz5\" (UniqueName: \"kubernetes.io/projected/ffa0c86e-66be-4f34-851f-4908eb22614f-kube-api-access-m7gz5\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.754117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.754141 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data-custom\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.754244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.762020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data-custom\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.762534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-combined-ca-bundle\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.763531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.778123 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gkpz\" (UniqueName: \"kubernetes.io/projected/0ec32f0d-8609-4579-abc4-1af5f98df4cf-kube-api-access-4gkpz\") pod \"heat-cfnapi-7bdf7467d4-ptkdm\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.856197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7gz5\" (UniqueName: \"kubernetes.io/projected/ffa0c86e-66be-4f34-851f-4908eb22614f-kube-api-access-m7gz5\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.856272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.856295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data-custom\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.856413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-combined-ca-bundle\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.861132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-combined-ca-bundle\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.861724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.861791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data-custom\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.877835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7gz5\" (UniqueName: \"kubernetes.io/projected/ffa0c86e-66be-4f34-851f-4908eb22614f-kube-api-access-m7gz5\") pod \"heat-api-5bcfc656b8-jq7ns\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") " pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.905232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.936821 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm"
Jan 29 03:47:14 crc kubenswrapper[4707]: I0129 03:47:14.962023 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.098504 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-67b9cbc75f-dv5cr"]
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.102887 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.107199 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.107507 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.107679 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.129706 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67b9cbc75f-dv5cr"]
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.142714 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79896f576-thgpc"]
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.268686 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-combined-ca-bundle\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.268890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f35f5f-1517-41b4-b354-59fd90d8fea5-run-httpd\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.269626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57f35f5f-1517-41b4-b354-59fd90d8fea5-etc-swift\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.269688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-public-tls-certs\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.269779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-config-data\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.269952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f35f5f-1517-41b4-b354-59fd90d8fea5-log-httpd\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.269999 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9kwd\" (UniqueName: \"kubernetes.io/projected/57f35f5f-1517-41b4-b354-59fd90d8fea5-kube-api-access-z9kwd\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.270179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-internal-tls-certs\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.372398 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-internal-tls-certs\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.372525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-combined-ca-bundle\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.372604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f35f5f-1517-41b4-b354-59fd90d8fea5-run-httpd\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.372660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57f35f5f-1517-41b4-b354-59fd90d8fea5-etc-swift\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.372691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-public-tls-certs\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.372720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-config-data\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.372739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f35f5f-1517-41b4-b354-59fd90d8fea5-log-httpd\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.372758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9kwd\" (UniqueName: \"kubernetes.io/projected/57f35f5f-1517-41b4-b354-59fd90d8fea5-kube-api-access-z9kwd\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.374161 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f35f5f-1517-41b4-b354-59fd90d8fea5-run-httpd\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.374673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f35f5f-1517-41b4-b354-59fd90d8fea5-log-httpd\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.379914 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-internal-tls-certs\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.390397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57f35f5f-1517-41b4-b354-59fd90d8fea5-etc-swift\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.393084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-public-tls-certs\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.393834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-config-data\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.395576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f35f5f-1517-41b4-b354-59fd90d8fea5-combined-ca-bundle\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.398837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9kwd\" (UniqueName: \"kubernetes.io/projected/57f35f5f-1517-41b4-b354-59fd90d8fea5-kube-api-access-z9kwd\") pod \"swift-proxy-67b9cbc75f-dv5cr\" (UID: \"57f35f5f-1517-41b4-b354-59fd90d8fea5\") " pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.448149 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.524876 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-5hptj"]
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.596370 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7bdf7467d4-ptkdm"]
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.713313 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5bcfc656b8-jq7ns"]
Jan 29 03:47:15 crc kubenswrapper[4707]: W0129 03:47:15.737485 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffa0c86e_66be_4f34_851f_4908eb22614f.slice/crio-0d00a448fa4e3d0b8a011563b64e623b32b9143cfe469f4028968cae81d389f7 WatchSource:0}: Error finding container 0d00a448fa4e3d0b8a011563b64e623b32b9143cfe469f4028968cae81d389f7: Status 404 returned error can't find the container with id 0d00a448fa4e3d0b8a011563b64e623b32b9143cfe469f4028968cae81d389f7
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.742273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" event={"ID":"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f","Type":"ContainerStarted","Data":"54532c5691f37a5cce1b5b2e5e0c661153e9047be70efb12c9e1dd3d47ffccd5"}
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.743909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79896f576-thgpc" event={"ID":"35911d95-d558-41c9-9c2b-811b60410a49","Type":"ContainerStarted","Data":"18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc"}
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.743940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79896f576-thgpc" event={"ID":"35911d95-d558-41c9-9c2b-811b60410a49","Type":"ContainerStarted","Data":"82bf2c358008dfeee50ff6eddf66028970320bfb989ec05987614d805393db10"}
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.745628 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-79896f576-thgpc"
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.752571 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" event={"ID":"0ec32f0d-8609-4579-abc4-1af5f98df4cf","Type":"ContainerStarted","Data":"527a3d9d334bc033a1a2dcde68929c2df7846106ed5f96e935c5431d1a7b5dee"}
Jan 29 03:47:15 crc kubenswrapper[4707]: I0129 03:47:15.798754 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-79896f576-thgpc" podStartSLOduration=1.798728331 podStartE2EDuration="1.798728331s" podCreationTimestamp="2026-01-29 03:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:15.788037867 +0000 UTC m=+1189.272266772" watchObservedRunningTime="2026-01-29 03:47:15.798728331 +0000 UTC m=+1189.282957236"
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.132752 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67b9cbc75f-dv5cr"]
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.321478 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.321805 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="ceilometer-central-agent" containerID="cri-o://1118eb4b1e4171cb3d663e7f4a87b7c13f51a7f00ab7659e586d0b5c2c518d12" gracePeriod=30
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.321869 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="proxy-httpd" containerID="cri-o://603e5fb1c5b5f4515460a4762b0fb2e3e2cda34cdf077e1429bc2e7e85fa3436" gracePeriod=30
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.321960 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="sg-core" containerID="cri-o://fe5b8da807d053532b8d5b7a876092045cce18f1599255f3fdb23be11a472a13" gracePeriod=30
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.322085 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="ceilometer-notification-agent" containerID="cri-o://3697beb5a38c159a124a16293b7b50a54fa83bbe07926ebb3ff3967c0594b670" gracePeriod=30
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.331861 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.769642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67b9cbc75f-dv5cr" event={"ID":"57f35f5f-1517-41b4-b354-59fd90d8fea5","Type":"ContainerStarted","Data":"ff59889fe2e27728b2c128006b814fb44c67cbdca652ed0b17940eea61ae9532"}
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.769699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67b9cbc75f-dv5cr" event={"ID":"57f35f5f-1517-41b4-b354-59fd90d8fea5","Type":"ContainerStarted","Data":"37e38423ec9f96054edd1e869bb846099217fb3fcdf6dc9ef21b4516fbf05443"}
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.780311 4707 generic.go:334] "Generic (PLEG): container finished" podID="b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" containerID="d49da7e672e09ce87908fdf2e4bdd2652a410c735912013b3f4ad88f08a97ca8" exitCode=0
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.780401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" event={"ID":"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f","Type":"ContainerDied","Data":"d49da7e672e09ce87908fdf2e4bdd2652a410c735912013b3f4ad88f08a97ca8"}
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.785689 4707 generic.go:334] "Generic (PLEG): container finished" podID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerID="603e5fb1c5b5f4515460a4762b0fb2e3e2cda34cdf077e1429bc2e7e85fa3436" exitCode=0
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.785733 4707 generic.go:334] "Generic (PLEG): container finished" podID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerID="fe5b8da807d053532b8d5b7a876092045cce18f1599255f3fdb23be11a472a13" exitCode=2
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.785807 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerDied","Data":"603e5fb1c5b5f4515460a4762b0fb2e3e2cda34cdf077e1429bc2e7e85fa3436"}
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.785842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerDied","Data":"fe5b8da807d053532b8d5b7a876092045cce18f1599255f3fdb23be11a472a13"}
Jan 29 03:47:16 crc kubenswrapper[4707]: I0129 03:47:16.793810 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bcfc656b8-jq7ns" event={"ID":"ffa0c86e-66be-4f34-851f-4908eb22614f","Type":"ContainerStarted","Data":"0d00a448fa4e3d0b8a011563b64e623b32b9143cfe469f4028968cae81d389f7"}
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.841171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" event={"ID":"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f","Type":"ContainerStarted","Data":"a61a8c84b6b80ff80b6088aedac76a006021d4d1797b4c1f84fa01f85dfc71d4"}
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.841923 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj"
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.855649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67b9cbc75f-dv5cr" event={"ID":"57f35f5f-1517-41b4-b354-59fd90d8fea5","Type":"ContainerStarted","Data":"3811cc8e8ce9a146cbf07ce76275d9a5c57842ed5aeb19bc234a1e5ad7a8dcd9"}
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.855696 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.855722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67b9cbc75f-dv5cr"
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.862434 4707 generic.go:334] "Generic (PLEG): container finished" podID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerID="3697beb5a38c159a124a16293b7b50a54fa83bbe07926ebb3ff3967c0594b670" exitCode=0
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.862473 4707 generic.go:334] "Generic (PLEG): container finished" podID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerID="1118eb4b1e4171cb3d663e7f4a87b7c13f51a7f00ab7659e586d0b5c2c518d12" exitCode=0
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.862505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerDied","Data":"3697beb5a38c159a124a16293b7b50a54fa83bbe07926ebb3ff3967c0594b670"}
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.862545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerDied","Data":"1118eb4b1e4171cb3d663e7f4a87b7c13f51a7f00ab7659e586d0b5c2c518d12"}
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.881375 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" podStartSLOduration=3.881348831 podStartE2EDuration="3.881348831s" podCreationTimestamp="2026-01-29 03:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:17.875165675 +0000 UTC m=+1191.359394590" watchObservedRunningTime="2026-01-29 03:47:17.881348831 +0000 UTC m=+1191.365577736"
Jan 29 03:47:17 crc kubenswrapper[4707]: I0129 03:47:17.910862 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-67b9cbc75f-dv5cr" podStartSLOduration=2.910840462 podStartE2EDuration="2.910840462s" podCreationTimestamp="2026-01-29 03:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:17.900178238 +0000 UTC m=+1191.384407143" watchObservedRunningTime="2026-01-29 03:47:17.910840462 +0000 UTC m=+1191.395069367"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.186877 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5wxq8"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.190506 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.203923 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5wxq8"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.256872 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd2dc\" (UniqueName: \"kubernetes.io/projected/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-kube-api-access-xd2dc\") pod \"nova-api-db-create-5wxq8\" (UID: \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\") " pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.256960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-operator-scripts\") pod \"nova-api-db-create-5wxq8\" (UID: \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\") " pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.337610 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-lzdkh"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.339674 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lzdkh"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.349847 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3f4f-account-create-update-jdxff"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.351097 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3f4f-account-create-update-jdxff"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.355906 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.362717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd2dc\" (UniqueName: \"kubernetes.io/projected/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-kube-api-access-xd2dc\") pod \"nova-api-db-create-5wxq8\" (UID: \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\") " pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.362834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-operator-scripts\") pod \"nova-api-db-create-5wxq8\" (UID: \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\") " pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.387155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-operator-scripts\") pod \"nova-api-db-create-5wxq8\" (UID: \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\") " pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.391432 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lzdkh"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.408738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd2dc\" (UniqueName: \"kubernetes.io/projected/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-kube-api-access-xd2dc\") pod \"nova-api-db-create-5wxq8\" (UID: \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\") " pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.446548 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3f4f-account-create-update-jdxff"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.479352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lrzs\" (UniqueName: \"kubernetes.io/projected/3d368b24-5c6c-4828-af36-7d553daeee3c-kube-api-access-9lrzs\") pod \"nova-cell0-db-create-lzdkh\" (UID: \"3d368b24-5c6c-4828-af36-7d553daeee3c\") " pod="openstack/nova-cell0-db-create-lzdkh"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.479968 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds4qg\" (UniqueName: \"kubernetes.io/projected/9b34d90a-4443-41fb-8220-d07465c9faa1-kube-api-access-ds4qg\") pod \"nova-api-3f4f-account-create-update-jdxff\" (UID: \"9b34d90a-4443-41fb-8220-d07465c9faa1\") " pod="openstack/nova-api-3f4f-account-create-update-jdxff"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.480037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b34d90a-4443-41fb-8220-d07465c9faa1-operator-scripts\") pod \"nova-api-3f4f-account-create-update-jdxff\" (UID: \"9b34d90a-4443-41fb-8220-d07465c9faa1\") " pod="openstack/nova-api-3f4f-account-create-update-jdxff"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.480118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d368b24-5c6c-4828-af36-7d553daeee3c-operator-scripts\") pod \"nova-cell0-db-create-lzdkh\" (UID: \"3d368b24-5c6c-4828-af36-7d553daeee3c\") " pod="openstack/nova-cell0-db-create-lzdkh"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.565783 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.573239 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.583270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d368b24-5c6c-4828-af36-7d553daeee3c-operator-scripts\") pod \"nova-cell0-db-create-lzdkh\" (UID: \"3d368b24-5c6c-4828-af36-7d553daeee3c\") " pod="openstack/nova-cell0-db-create-lzdkh"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.583368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lrzs\" (UniqueName: \"kubernetes.io/projected/3d368b24-5c6c-4828-af36-7d553daeee3c-kube-api-access-9lrzs\") pod \"nova-cell0-db-create-lzdkh\" (UID: \"3d368b24-5c6c-4828-af36-7d553daeee3c\") " pod="openstack/nova-cell0-db-create-lzdkh"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.583465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds4qg\" (UniqueName: \"kubernetes.io/projected/9b34d90a-4443-41fb-8220-d07465c9faa1-kube-api-access-ds4qg\") pod \"nova-api-3f4f-account-create-update-jdxff\" (UID: \"9b34d90a-4443-41fb-8220-d07465c9faa1\") " pod="openstack/nova-api-3f4f-account-create-update-jdxff"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.583595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b34d90a-4443-41fb-8220-d07465c9faa1-operator-scripts\") pod \"nova-api-3f4f-account-create-update-jdxff\" (UID: \"9b34d90a-4443-41fb-8220-d07465c9faa1\") " pod="openstack/nova-api-3f4f-account-create-update-jdxff"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.584348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b34d90a-4443-41fb-8220-d07465c9faa1-operator-scripts\") pod \"nova-api-3f4f-account-create-update-jdxff\" (UID: \"9b34d90a-4443-41fb-8220-d07465c9faa1\") " pod="openstack/nova-api-3f4f-account-create-update-jdxff"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.584674 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-aabf-account-create-update-xhfqq"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.584793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d368b24-5c6c-4828-af36-7d553daeee3c-operator-scripts\") pod \"nova-cell0-db-create-lzdkh\" (UID: \"3d368b24-5c6c-4828-af36-7d553daeee3c\") " pod="openstack/nova-cell0-db-create-lzdkh"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.586495 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aabf-account-create-update-xhfqq"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.594260 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.615589 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aabf-account-create-update-xhfqq"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.626501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lrzs\" (UniqueName: \"kubernetes.io/projected/3d368b24-5c6c-4828-af36-7d553daeee3c-kube-api-access-9lrzs\") pod \"nova-cell0-db-create-lzdkh\" (UID: \"3d368b24-5c6c-4828-af36-7d553daeee3c\") " pod="openstack/nova-cell0-db-create-lzdkh"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.633339 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ptdvm"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.635312 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ptdvm"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.663744 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds4qg\" (UniqueName: \"kubernetes.io/projected/9b34d90a-4443-41fb-8220-d07465c9faa1-kube-api-access-ds4qg\") pod \"nova-api-3f4f-account-create-update-jdxff\" (UID: \"9b34d90a-4443-41fb-8220-d07465c9faa1\") " pod="openstack/nova-api-3f4f-account-create-update-jdxff"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.663848 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ptdvm"]
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.697564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snfw7\" (UniqueName: \"kubernetes.io/projected/4264d1cc-366e-412c-9f2d-8276c6b5fc70-kube-api-access-snfw7\") pod \"nova-cell0-aabf-account-create-update-xhfqq\" (UID: \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\") " pod="openstack/nova-cell0-aabf-account-create-update-xhfqq"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.697642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4264d1cc-366e-412c-9f2d-8276c6b5fc70-operator-scripts\") pod \"nova-cell0-aabf-account-create-update-xhfqq\" (UID: \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\") " pod="openstack/nova-cell0-aabf-account-create-update-xhfqq"
Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.787282 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lzdkh" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.800520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snfw7\" (UniqueName: \"kubernetes.io/projected/4264d1cc-366e-412c-9f2d-8276c6b5fc70-kube-api-access-snfw7\") pod \"nova-cell0-aabf-account-create-update-xhfqq\" (UID: \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\") " pod="openstack/nova-cell0-aabf-account-create-update-xhfqq" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.800595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4264d1cc-366e-412c-9f2d-8276c6b5fc70-operator-scripts\") pod \"nova-cell0-aabf-account-create-update-xhfqq\" (UID: \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\") " pod="openstack/nova-cell0-aabf-account-create-update-xhfqq" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.800643 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa6d4aa3-52da-4273-8dae-1ac01656cab9-operator-scripts\") pod \"nova-cell1-db-create-ptdvm\" (UID: \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\") " pod="openstack/nova-cell1-db-create-ptdvm" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.800736 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fqx6\" (UniqueName: \"kubernetes.io/projected/fa6d4aa3-52da-4273-8dae-1ac01656cab9-kube-api-access-5fqx6\") pod \"nova-cell1-db-create-ptdvm\" (UID: \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\") " pod="openstack/nova-cell1-db-create-ptdvm" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.802050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4264d1cc-366e-412c-9f2d-8276c6b5fc70-operator-scripts\") pod \"nova-cell0-aabf-account-create-update-xhfqq\" (UID: \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\") " pod="openstack/nova-cell0-aabf-account-create-update-xhfqq" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.835754 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3f4f-account-create-update-jdxff" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.838738 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-bbea-account-create-update-jcvzk"] Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.840306 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.843672 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.846529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snfw7\" (UniqueName: \"kubernetes.io/projected/4264d1cc-366e-412c-9f2d-8276c6b5fc70-kube-api-access-snfw7\") pod \"nova-cell0-aabf-account-create-update-xhfqq\" (UID: \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\") " pod="openstack/nova-cell0-aabf-account-create-update-xhfqq" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.850064 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bbea-account-create-update-jcvzk"] Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.903406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa6d4aa3-52da-4273-8dae-1ac01656cab9-operator-scripts\") pod \"nova-cell1-db-create-ptdvm\" (UID: \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\") " pod="openstack/nova-cell1-db-create-ptdvm" Jan 29 03:47:18 crc 
kubenswrapper[4707]: I0129 03:47:18.905661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fqx6\" (UniqueName: \"kubernetes.io/projected/fa6d4aa3-52da-4273-8dae-1ac01656cab9-kube-api-access-5fqx6\") pod \"nova-cell1-db-create-ptdvm\" (UID: \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\") " pod="openstack/nova-cell1-db-create-ptdvm" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.905725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa6d4aa3-52da-4273-8dae-1ac01656cab9-operator-scripts\") pod \"nova-cell1-db-create-ptdvm\" (UID: \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\") " pod="openstack/nova-cell1-db-create-ptdvm" Jan 29 03:47:18 crc kubenswrapper[4707]: I0129 03:47:18.940216 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fqx6\" (UniqueName: \"kubernetes.io/projected/fa6d4aa3-52da-4273-8dae-1ac01656cab9-kube-api-access-5fqx6\") pod \"nova-cell1-db-create-ptdvm\" (UID: \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\") " pod="openstack/nova-cell1-db-create-ptdvm" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.007715 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mt4v\" (UniqueName: \"kubernetes.io/projected/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-kube-api-access-9mt4v\") pod \"nova-cell1-bbea-account-create-update-jcvzk\" (UID: \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\") " pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.008201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-operator-scripts\") pod \"nova-cell1-bbea-account-create-update-jcvzk\" (UID: \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\") " 
pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.075099 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aabf-account-create-update-xhfqq" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.083288 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ptdvm" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.111013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-operator-scripts\") pod \"nova-cell1-bbea-account-create-update-jcvzk\" (UID: \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\") " pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.111100 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mt4v\" (UniqueName: \"kubernetes.io/projected/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-kube-api-access-9mt4v\") pod \"nova-cell1-bbea-account-create-update-jcvzk\" (UID: \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\") " pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.112292 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-operator-scripts\") pod \"nova-cell1-bbea-account-create-update-jcvzk\" (UID: \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\") " pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.142150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mt4v\" (UniqueName: \"kubernetes.io/projected/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-kube-api-access-9mt4v\") pod 
\"nova-cell1-bbea-account-create-update-jcvzk\" (UID: \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\") " pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.195626 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.737915 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.831523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-log-httpd\") pod \"32831b02-3e9c-449c-82ce-e134b50ceec4\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.832012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-scripts\") pod \"32831b02-3e9c-449c-82ce-e134b50ceec4\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.832337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "32831b02-3e9c-449c-82ce-e134b50ceec4" (UID: "32831b02-3e9c-449c-82ce-e134b50ceec4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.832380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-sg-core-conf-yaml\") pod \"32831b02-3e9c-449c-82ce-e134b50ceec4\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.833932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "32831b02-3e9c-449c-82ce-e134b50ceec4" (UID: "32831b02-3e9c-449c-82ce-e134b50ceec4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.833971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-run-httpd\") pod \"32831b02-3e9c-449c-82ce-e134b50ceec4\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.834003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-config-data\") pod \"32831b02-3e9c-449c-82ce-e134b50ceec4\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.834115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msdkr\" (UniqueName: \"kubernetes.io/projected/32831b02-3e9c-449c-82ce-e134b50ceec4-kube-api-access-msdkr\") pod \"32831b02-3e9c-449c-82ce-e134b50ceec4\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.834250 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-combined-ca-bundle\") pod \"32831b02-3e9c-449c-82ce-e134b50ceec4\" (UID: \"32831b02-3e9c-449c-82ce-e134b50ceec4\") " Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.834821 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.834833 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32831b02-3e9c-449c-82ce-e134b50ceec4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.836912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-scripts" (OuterVolumeSpecName: "scripts") pod "32831b02-3e9c-449c-82ce-e134b50ceec4" (UID: "32831b02-3e9c-449c-82ce-e134b50ceec4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.848456 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32831b02-3e9c-449c-82ce-e134b50ceec4-kube-api-access-msdkr" (OuterVolumeSpecName: "kube-api-access-msdkr") pod "32831b02-3e9c-449c-82ce-e134b50ceec4" (UID: "32831b02-3e9c-449c-82ce-e134b50ceec4"). InnerVolumeSpecName "kube-api-access-msdkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.925047 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "32831b02-3e9c-449c-82ce-e134b50ceec4" (UID: "32831b02-3e9c-449c-82ce-e134b50ceec4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.937454 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.937491 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.937515 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msdkr\" (UniqueName: \"kubernetes.io/projected/32831b02-3e9c-449c-82ce-e134b50ceec4-kube-api-access-msdkr\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.938891 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.939510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32831b02-3e9c-449c-82ce-e134b50ceec4","Type":"ContainerDied","Data":"b05d3cfa3c3360dbd3163cb35da1d7a92af3d8b94c178a37e158f30bf93a5b6b"} Jan 29 03:47:19 crc kubenswrapper[4707]: I0129 03:47:19.939565 4707 scope.go:117] "RemoveContainer" containerID="603e5fb1c5b5f4515460a4762b0fb2e3e2cda34cdf077e1429bc2e7e85fa3436" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.047597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-config-data" (OuterVolumeSpecName: "config-data") pod "32831b02-3e9c-449c-82ce-e134b50ceec4" (UID: "32831b02-3e9c-449c-82ce-e134b50ceec4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.049791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32831b02-3e9c-449c-82ce-e134b50ceec4" (UID: "32831b02-3e9c-449c-82ce-e134b50ceec4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.142011 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.142056 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32831b02-3e9c-449c-82ce-e134b50ceec4-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.296131 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.306266 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.360439 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:20 crc kubenswrapper[4707]: E0129 03:47:20.360931 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="ceilometer-central-agent" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.360952 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="ceilometer-central-agent" Jan 29 03:47:20 crc kubenswrapper[4707]: E0129 03:47:20.360970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="ceilometer-notification-agent" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.360980 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="ceilometer-notification-agent" Jan 29 03:47:20 crc kubenswrapper[4707]: E0129 03:47:20.360998 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="sg-core" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.361004 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="sg-core" Jan 29 03:47:20 crc kubenswrapper[4707]: E0129 03:47:20.361024 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="proxy-httpd" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.361030 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="proxy-httpd" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.361216 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="proxy-httpd" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.361229 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="sg-core" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.361244 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="ceilometer-central-agent" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.361251 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" containerName="ceilometer-notification-agent" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.362956 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.367835 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.367900 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.433655 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5wxq8"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.450912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.450965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-scripts\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.451012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-log-httpd\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.451039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-config-data\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " 
pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.451092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-run-httpd\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.451150 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.451175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjll\" (UniqueName: \"kubernetes.io/projected/e891f1aa-f142-469b-891f-01f8930884d3-kube-api-access-9sjll\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.480678 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.557047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-log-httpd\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.557114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-config-data\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " 
pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.557176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-run-httpd\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.557243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.557269 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjll\" (UniqueName: \"kubernetes.io/projected/e891f1aa-f142-469b-891f-01f8930884d3-kube-api-access-9sjll\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.557330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-scripts\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.557349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.558257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-run-httpd\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.558844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-log-httpd\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.570797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-scripts\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.582315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.587248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.660839 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7b6695978c-nxvfw"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.661426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjll\" (UniqueName: \"kubernetes.io/projected/e891f1aa-f142-469b-891f-01f8930884d3-kube-api-access-9sjll\") pod \"ceilometer-0\" 
(UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.664152 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.671052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-config-data\") pod \"ceilometer-0\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.704770 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.708486 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6d64f59df8-kkdq9"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.711014 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.723727 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7b6695978c-nxvfw"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.738674 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6d64f59df8-kkdq9"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.751089 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5bf5cf4876-9r9d8"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.753111 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.761912 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5bf5cf4876-9r9d8"] Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.776652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data-custom\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.776692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8zxn\" (UniqueName: \"kubernetes.io/projected/0644f450-79f9-4f20-9476-e31d4b673507-kube-api-access-c8zxn\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.776748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ddee41-4c3c-4b7b-b637-2de751496d37-config-data\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.776777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ddee41-4c3c-4b7b-b637-2de751496d37-combined-ca-bundle\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.776800 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52ddee41-4c3c-4b7b-b637-2de751496d37-config-data-custom\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.776836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-combined-ca-bundle\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.776893 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4stj6\" (UniqueName: \"kubernetes.io/projected/52ddee41-4c3c-4b7b-b637-2de751496d37-kube-api-access-4stj6\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.776917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ddee41-4c3c-4b7b-b637-2de751496d37-config-data\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879287 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ddee41-4c3c-4b7b-b637-2de751496d37-combined-ca-bundle\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879311 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52ddee41-4c3c-4b7b-b637-2de751496d37-config-data-custom\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-combined-ca-bundle\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-combined-ca-bundle\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879409 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879440 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmqmt\" (UniqueName: \"kubernetes.io/projected/69cae86f-c7b4-4298-b9c5-6925a215df89-kube-api-access-tmqmt\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4stj6\" (UniqueName: \"kubernetes.io/projected/52ddee41-4c3c-4b7b-b637-2de751496d37-kube-api-access-4stj6\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879488 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data-custom\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879569 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data-custom\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.879589 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8zxn\" (UniqueName: \"kubernetes.io/projected/0644f450-79f9-4f20-9476-e31d4b673507-kube-api-access-c8zxn\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.885723 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ddee41-4c3c-4b7b-b637-2de751496d37-config-data\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.888276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ddee41-4c3c-4b7b-b637-2de751496d37-combined-ca-bundle\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.898001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-combined-ca-bundle\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.898320 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.899043 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/52ddee41-4c3c-4b7b-b637-2de751496d37-config-data-custom\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.903027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data-custom\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.905745 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4stj6\" (UniqueName: \"kubernetes.io/projected/52ddee41-4c3c-4b7b-b637-2de751496d37-kube-api-access-4stj6\") pod \"heat-engine-7b6695978c-nxvfw\" (UID: \"52ddee41-4c3c-4b7b-b637-2de751496d37\") " pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.910636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8zxn\" (UniqueName: \"kubernetes.io/projected/0644f450-79f9-4f20-9476-e31d4b673507-kube-api-access-c8zxn\") pod \"heat-api-6d64f59df8-kkdq9\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.981137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-combined-ca-bundle\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.981208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.981248 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmqmt\" (UniqueName: \"kubernetes.io/projected/69cae86f-c7b4-4298-b9c5-6925a215df89-kube-api-access-tmqmt\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.981292 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data-custom\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.985731 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.986427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-combined-ca-bundle\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:20 crc kubenswrapper[4707]: I0129 03:47:20.986432 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data-custom\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:21 crc kubenswrapper[4707]: I0129 03:47:21.005074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmqmt\" (UniqueName: \"kubernetes.io/projected/69cae86f-c7b4-4298-b9c5-6925a215df89-kube-api-access-tmqmt\") pod \"heat-cfnapi-5bf5cf4876-9r9d8\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:21 crc kubenswrapper[4707]: I0129 03:47:21.028659 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:21 crc kubenswrapper[4707]: I0129 03:47:21.037845 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:21 crc kubenswrapper[4707]: I0129 03:47:21.073002 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:21 crc kubenswrapper[4707]: I0129 03:47:21.258853 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32831b02-3e9c-449c-82ce-e134b50ceec4" path="/var/lib/kubelet/pods/32831b02-3e9c-449c-82ce-e134b50ceec4/volumes" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.096103 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5bcfc656b8-jq7ns"] Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.119847 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7bdf7467d4-ptkdm"] Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.129363 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-74d4777d5f-4mj7v"] Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.132983 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.135777 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.135793 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.194793 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-74d4777d5f-4mj7v"] Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.208259 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-public-tls-certs\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.208381 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-internal-tls-certs\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.208443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-combined-ca-bundle\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.208470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-config-data\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.208500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs5qm\" (UniqueName: \"kubernetes.io/projected/9b19f31f-481f-4feb-91bb-09df20de5654-kube-api-access-gs5qm\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.208516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-config-data-custom\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.212690 
4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-78c45ff765-hw8sk"] Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.214300 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.219468 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.219524 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.259900 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-78c45ff765-hw8sk"] Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.317624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-public-tls-certs\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318127 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-config-data\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpkbs\" (UniqueName: \"kubernetes.io/projected/0f27421c-79af-4e0d-b97f-c1d73b2524e2-kube-api-access-wpkbs\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" 
Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318215 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-combined-ca-bundle\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318247 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-config-data-custom\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-internal-tls-certs\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-combined-ca-bundle\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318350 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-config-data\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc 
kubenswrapper[4707]: I0129 03:47:22.318367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs5qm\" (UniqueName: \"kubernetes.io/projected/9b19f31f-481f-4feb-91bb-09df20de5654-kube-api-access-gs5qm\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318386 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-config-data-custom\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-internal-tls-certs\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.318456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-public-tls-certs\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.331512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-internal-tls-certs\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc 
kubenswrapper[4707]: I0129 03:47:22.331862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-public-tls-certs\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.332482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-config-data-custom\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.341273 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs5qm\" (UniqueName: \"kubernetes.io/projected/9b19f31f-481f-4feb-91bb-09df20de5654-kube-api-access-gs5qm\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.357983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-combined-ca-bundle\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.359121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b19f31f-481f-4feb-91bb-09df20de5654-config-data\") pod \"heat-api-74d4777d5f-4mj7v\" (UID: \"9b19f31f-481f-4feb-91bb-09df20de5654\") " pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.420564 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wpkbs\" (UniqueName: \"kubernetes.io/projected/0f27421c-79af-4e0d-b97f-c1d73b2524e2-kube-api-access-wpkbs\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.421145 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-combined-ca-bundle\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.421173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-config-data-custom\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.421259 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-internal-tls-certs\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.421289 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-public-tls-certs\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.421380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-config-data\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.425479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-combined-ca-bundle\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.425985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-public-tls-certs\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.427339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-config-data\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.436476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-config-data-custom\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.440226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0f27421c-79af-4e0d-b97f-c1d73b2524e2-internal-tls-certs\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.447673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpkbs\" (UniqueName: \"kubernetes.io/projected/0f27421c-79af-4e0d-b97f-c1d73b2524e2-kube-api-access-wpkbs\") pod \"heat-cfnapi-78c45ff765-hw8sk\" (UID: \"0f27421c-79af-4e0d-b97f-c1d73b2524e2\") " pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.511406 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:22 crc kubenswrapper[4707]: I0129 03:47:22.551400 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:24 crc kubenswrapper[4707]: I0129 03:47:24.907717 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" Jan 29 03:47:25 crc kubenswrapper[4707]: I0129 03:47:25.005788 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbzsq"] Jan 29 03:47:25 crc kubenswrapper[4707]: I0129 03:47:25.006039 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" podUID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" containerName="dnsmasq-dns" containerID="cri-o://503feb7ea47f72e524427b1131df8bfd7e2d320d9513a05384b1d9c1a6962c52" gracePeriod=10 Jan 29 03:47:25 crc kubenswrapper[4707]: I0129 03:47:25.458708 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-67b9cbc75f-dv5cr" Jan 29 03:47:25 crc kubenswrapper[4707]: I0129 03:47:25.460075 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-67b9cbc75f-dv5cr" Jan 29 03:47:26 crc kubenswrapper[4707]: I0129 03:47:26.037274 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" containerID="503feb7ea47f72e524427b1131df8bfd7e2d320d9513a05384b1d9c1a6962c52" exitCode=0 Jan 29 03:47:26 crc kubenswrapper[4707]: I0129 03:47:26.037401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" event={"ID":"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92","Type":"ContainerDied","Data":"503feb7ea47f72e524427b1131df8bfd7e2d320d9513a05384b1d9c1a6962c52"} Jan 29 03:47:26 crc kubenswrapper[4707]: I0129 03:47:26.687558 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" podUID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.168:5353: connect: connection refused" Jan 29 03:47:27 crc kubenswrapper[4707]: I0129 03:47:27.724348 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f8ffb664f-gwtlc" Jan 29 03:47:27 crc kubenswrapper[4707]: I0129 03:47:27.764850 4707 scope.go:117] "RemoveContainer" containerID="fe5b8da807d053532b8d5b7a876092045cce18f1599255f3fdb23be11a472a13" Jan 29 03:47:27 crc kubenswrapper[4707]: I0129 03:47:27.821576 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f44bf9c6d-746dh"] Jan 29 03:47:27 crc kubenswrapper[4707]: I0129 03:47:27.821862 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f44bf9c6d-746dh" podUID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerName="neutron-api" containerID="cri-o://479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f" gracePeriod=30 Jan 29 03:47:27 crc kubenswrapper[4707]: I0129 03:47:27.822343 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f44bf9c6d-746dh" 
podUID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerName="neutron-httpd" containerID="cri-o://9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd" gracePeriod=30 Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.137862 4707 generic.go:334] "Generic (PLEG): container finished" podID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerID="9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd" exitCode=0 Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.142895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f44bf9c6d-746dh" event={"ID":"e70752bb-f7b2-4cd4-ace7-b64b837a8e95","Type":"ContainerDied","Data":"9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd"} Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.150705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5wxq8" event={"ID":"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce","Type":"ContainerStarted","Data":"02709186526e20d4178f93697f6e23f91ca199b7fc2f5978bced8b6b4b8674a8"} Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.280131 4707 scope.go:117] "RemoveContainer" containerID="3697beb5a38c159a124a16293b7b50a54fa83bbe07926ebb3ff3967c0594b670" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.424414 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.516141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-swift-storage-0\") pod \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.516613 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mht99\" (UniqueName: \"kubernetes.io/projected/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-kube-api-access-mht99\") pod \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.516665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-nb\") pod \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.516706 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-svc\") pod \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.516743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-sb\") pod \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.516828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-config\") pod \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\" (UID: \"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92\") " Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.528051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-kube-api-access-mht99" (OuterVolumeSpecName: "kube-api-access-mht99") pod "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" (UID: "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92"). InnerVolumeSpecName "kube-api-access-mht99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.619881 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mht99\" (UniqueName: \"kubernetes.io/projected/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-kube-api-access-mht99\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.692988 4707 scope.go:117] "RemoveContainer" containerID="1118eb4b1e4171cb3d663e7f4a87b7c13f51a7f00ab7659e586d0b5c2c518d12" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.696448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" (UID: "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.696867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" (UID: "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.736878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" (UID: "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.749314 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.749358 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.749369 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.754859 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.775875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3f4f-account-create-update-jdxff"] Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.797021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" (UID: "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.802037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-config" (OuterVolumeSpecName: "config") pod "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" (UID: "bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.854202 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.854235 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.972162 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bbea-account-create-update-jcvzk"] Jan 29 03:47:28 crc kubenswrapper[4707]: I0129 03:47:28.995142 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lzdkh"] Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.015021 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aabf-account-create-update-xhfqq"] Jan 29 03:47:29 crc kubenswrapper[4707]: W0129 03:47:29.039674 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d368b24_5c6c_4828_af36_7d553daeee3c.slice/crio-81d52299c251b1cc855b32cc373aa836e250cba2a732532a887752967c3fc5ef WatchSource:0}: Error finding container 81d52299c251b1cc855b32cc373aa836e250cba2a732532a887752967c3fc5ef: Status 404 returned error 
can't find the container with id 81d52299c251b1cc855b32cc373aa836e250cba2a732532a887752967c3fc5ef Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.041362 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ptdvm"] Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.085981 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.127886 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.200614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aabf-account-create-update-xhfqq" event={"ID":"4264d1cc-366e-412c-9f2d-8276c6b5fc70","Type":"ContainerStarted","Data":"72f4cef6b08383a7028b6fe064b13dff00a7a8364c3083261de59f745393a106"} Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.205584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" event={"ID":"0ec32f0d-8609-4579-abc4-1af5f98df4cf","Type":"ContainerStarted","Data":"35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939"} Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.205772 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" podUID="0ec32f0d-8609-4579-abc4-1af5f98df4cf" containerName="heat-cfnapi" containerID="cri-o://35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939" gracePeriod=60 Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.206045 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.215088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" 
event={"ID":"bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92","Type":"ContainerDied","Data":"b4c19a3847c6ef6ddac28b3c557a0413d871442d889b16296c5a765d7aff1e65"} Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.215183 4707 scope.go:117] "RemoveContainer" containerID="503feb7ea47f72e524427b1131df8bfd7e2d320d9513a05384b1d9c1a6962c52" Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.215405 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-hbzsq" Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.220500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" event={"ID":"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64","Type":"ContainerStarted","Data":"70cc4dc92905c522cc97958f33f06624733942d8d72dc869a668f031b28c26f7"} Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.232500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ptdvm" event={"ID":"fa6d4aa3-52da-4273-8dae-1ac01656cab9","Type":"ContainerStarted","Data":"dd346bd17926df16b0b84354bffabb19afac469ffe4f13e072d0af1357d3e536"} Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.250811 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" podStartSLOduration=11.030310977 podStartE2EDuration="15.25078987s" podCreationTimestamp="2026-01-29 03:47:14 +0000 UTC" firstStartedPulling="2026-01-29 03:47:15.584779144 +0000 UTC m=+1189.069008049" lastFinishedPulling="2026-01-29 03:47:19.805258037 +0000 UTC m=+1193.289486942" observedRunningTime="2026-01-29 03:47:29.238433058 +0000 UTC m=+1202.722661963" watchObservedRunningTime="2026-01-29 03:47:29.25078987 +0000 UTC m=+1202.735018775" Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.253241 4707 scope.go:117] "RemoveContainer" containerID="c73244cc348c64a8b54de140e7180144e315f14a6a60e1ccb0321e280bb70416" Jan 29 03:47:29 crc 
kubenswrapper[4707]: I0129 03:47:29.286369 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lzdkh" event={"ID":"3d368b24-5c6c-4828-af36-7d553daeee3c","Type":"ContainerStarted","Data":"81d52299c251b1cc855b32cc373aa836e250cba2a732532a887752967c3fc5ef"} Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.286423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3f4f-account-create-update-jdxff" event={"ID":"9b34d90a-4443-41fb-8220-d07465c9faa1","Type":"ContainerStarted","Data":"03bdeeab01e5e6819ec47ed688d2b5db9061320992ccfc9c0481e02f8798441f"} Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.323608 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbzsq"] Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.336751 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6d64f59df8-kkdq9"] Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.358662 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbzsq"] Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.371398 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.390603 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-74d4777d5f-4mj7v"] Jan 29 03:47:29 crc kubenswrapper[4707]: W0129 03:47:29.395369 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode891f1aa_f142_469b_891f_01f8930884d3.slice/crio-1f985a7694692100502ce2f7a23a87a2d110366f998da047683adac61c916492 WatchSource:0}: Error finding container 1f985a7694692100502ce2f7a23a87a2d110366f998da047683adac61c916492: Status 404 returned error can't find the container with id 1f985a7694692100502ce2f7a23a87a2d110366f998da047683adac61c916492 Jan 29 03:47:29 crc 
kubenswrapper[4707]: I0129 03:47:29.449254 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7b6695978c-nxvfw"] Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.502117 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-78c45ff765-hw8sk"] Jan 29 03:47:29 crc kubenswrapper[4707]: I0129 03:47:29.686371 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5bf5cf4876-9r9d8"] Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.286214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerStarted","Data":"1f985a7694692100502ce2f7a23a87a2d110366f998da047683adac61c916492"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.313454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lzdkh" event={"ID":"3d368b24-5c6c-4828-af36-7d553daeee3c","Type":"ContainerStarted","Data":"bec773fe3ccd0405d46c8774cb69869fba69c76270893dd022abf7bc1e25a200"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.330599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d64f59df8-kkdq9" event={"ID":"0644f450-79f9-4f20-9476-e31d4b673507","Type":"ContainerStarted","Data":"7dfa1052be66ce6b809419604907016c2068c17840a03451d083e3c33faad9f6"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.347170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-74d4777d5f-4mj7v" event={"ID":"9b19f31f-481f-4feb-91bb-09df20de5654","Type":"ContainerStarted","Data":"a3f457acd872808a88f0c692cae56a04fd1b632393579c0ab93fa3053fcdd0db"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.357054 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-lzdkh" podStartSLOduration=12.357026805 podStartE2EDuration="12.357026805s" podCreationTimestamp="2026-01-29 03:47:18 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:30.346192726 +0000 UTC m=+1203.830421631" watchObservedRunningTime="2026-01-29 03:47:30.357026805 +0000 UTC m=+1203.841255710" Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.357835 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3f4f-account-create-update-jdxff" event={"ID":"9b34d90a-4443-41fb-8220-d07465c9faa1","Type":"ContainerStarted","Data":"e802bb1aa8450675870feccb01861e8857fb6fec39300bf58877fb4a677f92f5"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.373032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5wxq8" event={"ID":"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce","Type":"ContainerStarted","Data":"c94e11f8ba76a194d102eab4b4403b3307d9fe0b9fc8a11ff064dc357716833a"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.385797 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-3f4f-account-create-update-jdxff" podStartSLOduration=12.385774094 podStartE2EDuration="12.385774094s" podCreationTimestamp="2026-01-29 03:47:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:30.375658426 +0000 UTC m=+1203.859887331" watchObservedRunningTime="2026-01-29 03:47:30.385774094 +0000 UTC m=+1203.870002999" Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.386738 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f44781c6-75de-479b-b6bb-33bc27d468fa","Type":"ContainerStarted","Data":"0932b0ac3dc2b05af0e58c966c543784158e63ace8b7014e5b797132c779395b"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.409672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ptdvm" 
event={"ID":"fa6d4aa3-52da-4273-8dae-1ac01656cab9","Type":"ContainerStarted","Data":"db24f942fa5db4ad356378e0559dff3af60fef107a60255ec0a6475caefa47d2"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.422903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bcfc656b8-jq7ns" event={"ID":"ffa0c86e-66be-4f34-851f-4908eb22614f","Type":"ContainerStarted","Data":"679cbfdc14bb3d07c8bfe13fc180dc5b68454c4c640b96b3614ea951896cee7d"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.423720 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5bcfc656b8-jq7ns" Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.423720 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5bcfc656b8-jq7ns" podUID="ffa0c86e-66be-4f34-851f-4908eb22614f" containerName="heat-api" containerID="cri-o://679cbfdc14bb3d07c8bfe13fc180dc5b68454c4c640b96b3614ea951896cee7d" gracePeriod=60 Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.436125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-78c45ff765-hw8sk" event={"ID":"0f27421c-79af-4e0d-b97f-c1d73b2524e2","Type":"ContainerStarted","Data":"051245eb5f1a9b2b24c63bfe3442cb29af5741a7c2a2ddf88b93b98ac3481e48"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.445584 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b6695978c-nxvfw" event={"ID":"52ddee41-4c3c-4b7b-b637-2de751496d37","Type":"ContainerStarted","Data":"4dc43ae77f376a410f80393ec25e2dbfa1ae0b8508cd44cd1a8339414c5fea44"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.452115 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.259864011 podStartE2EDuration="20.452094524s" podCreationTimestamp="2026-01-29 03:47:10 +0000 UTC" firstStartedPulling="2026-01-29 03:47:11.016415489 +0000 UTC m=+1184.500644394" 
lastFinishedPulling="2026-01-29 03:47:28.208646002 +0000 UTC m=+1201.692874907" observedRunningTime="2026-01-29 03:47:30.423244852 +0000 UTC m=+1203.907473757" watchObservedRunningTime="2026-01-29 03:47:30.452094524 +0000 UTC m=+1203.936323429" Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.456842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" event={"ID":"69cae86f-c7b4-4298-b9c5-6925a215df89","Type":"ContainerStarted","Data":"20cd6c50fc70370e1997b9d33c03aaebcd619cf28d1def9429db3b40fa1b797e"} Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.503452 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-ptdvm" podStartSLOduration=12.503425927 podStartE2EDuration="12.503425927s" podCreationTimestamp="2026-01-29 03:47:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:30.442160741 +0000 UTC m=+1203.926389646" watchObservedRunningTime="2026-01-29 03:47:30.503425927 +0000 UTC m=+1203.987654832" Jan 29 03:47:30 crc kubenswrapper[4707]: I0129 03:47:30.524082 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5bcfc656b8-jq7ns" podStartSLOduration=12.435996196 podStartE2EDuration="16.524055395s" podCreationTimestamp="2026-01-29 03:47:14 +0000 UTC" firstStartedPulling="2026-01-29 03:47:15.749022045 +0000 UTC m=+1189.233250950" lastFinishedPulling="2026-01-29 03:47:19.837081244 +0000 UTC m=+1193.321310149" observedRunningTime="2026-01-29 03:47:30.463763536 +0000 UTC m=+1203.947992441" watchObservedRunningTime="2026-01-29 03:47:30.524055395 +0000 UTC m=+1204.008284300" Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.255975 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" 
path="/var/lib/kubelet/pods/bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92/volumes" Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.487356 4707 generic.go:334] "Generic (PLEG): container finished" podID="4264d1cc-366e-412c-9f2d-8276c6b5fc70" containerID="ce1f1515c142d5cfc720c2d0577fa37fa94df64c46af092314277a817349d2e2" exitCode=0 Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.488065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aabf-account-create-update-xhfqq" event={"ID":"4264d1cc-366e-412c-9f2d-8276c6b5fc70","Type":"ContainerDied","Data":"ce1f1515c142d5cfc720c2d0577fa37fa94df64c46af092314277a817349d2e2"} Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.490302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-78c45ff765-hw8sk" event={"ID":"0f27421c-79af-4e0d-b97f-c1d73b2524e2","Type":"ContainerStarted","Data":"6fe814594078799964fc60eef29597a392d8cb95fbd8d0c38e691d962e94ede9"} Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.491414 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.494648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b6695978c-nxvfw" event={"ID":"52ddee41-4c3c-4b7b-b637-2de751496d37","Type":"ContainerStarted","Data":"c39c080fce8eb80ba8842ef8f52e2da31f5c2cb85f1b308099e063cd64a3d94a"} Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.494722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.501896 4707 generic.go:334] "Generic (PLEG): container finished" podID="69cae86f-c7b4-4298-b9c5-6925a215df89" containerID="279c0f8771b77a0572ff84fa44a8af13ee01cfa81330cf3d473f769b2eb7e89b" exitCode=1 Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.501963 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" event={"ID":"69cae86f-c7b4-4298-b9c5-6925a215df89","Type":"ContainerDied","Data":"279c0f8771b77a0572ff84fa44a8af13ee01cfa81330cf3d473f769b2eb7e89b"} Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.502436 4707 scope.go:117] "RemoveContainer" containerID="279c0f8771b77a0572ff84fa44a8af13ee01cfa81330cf3d473f769b2eb7e89b" Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.508880 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b79725c-fee3-4f6f-84b7-c0dbd52e75ce" containerID="c94e11f8ba76a194d102eab4b4403b3307d9fe0b9fc8a11ff064dc357716833a" exitCode=0 Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.508928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5wxq8" event={"ID":"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce","Type":"ContainerDied","Data":"c94e11f8ba76a194d102eab4b4403b3307d9fe0b9fc8a11ff064dc357716833a"} Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.524910 4707 generic.go:334] "Generic (PLEG): container finished" podID="fa6d4aa3-52da-4273-8dae-1ac01656cab9" containerID="db24f942fa5db4ad356378e0559dff3af60fef107a60255ec0a6475caefa47d2" exitCode=0 Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.524978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ptdvm" event={"ID":"fa6d4aa3-52da-4273-8dae-1ac01656cab9","Type":"ContainerDied","Data":"db24f942fa5db4ad356378e0559dff3af60fef107a60255ec0a6475caefa47d2"} Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.542965 4707 generic.go:334] "Generic (PLEG): container finished" podID="0644f450-79f9-4f20-9476-e31d4b673507" containerID="fb6306a4616a25fb57ef6e07cd8e59cd818f2fc89ec4acdc7bdb7c5a94ff2aba" exitCode=1 Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.543128 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d64f59df8-kkdq9" 
event={"ID":"0644f450-79f9-4f20-9476-e31d4b673507","Type":"ContainerDied","Data":"fb6306a4616a25fb57ef6e07cd8e59cd818f2fc89ec4acdc7bdb7c5a94ff2aba"}
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.544173 4707 scope.go:117] "RemoveContainer" containerID="fb6306a4616a25fb57ef6e07cd8e59cd818f2fc89ec4acdc7bdb7c5a94ff2aba"
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.546752 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7b6695978c-nxvfw" podStartSLOduration=11.546728608 podStartE2EDuration="11.546728608s" podCreationTimestamp="2026-01-29 03:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:31.531191966 +0000 UTC m=+1205.015420871" watchObservedRunningTime="2026-01-29 03:47:31.546728608 +0000 UTC m=+1205.030957513"
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.548303 4707 generic.go:334] "Generic (PLEG): container finished" podID="ffa0c86e-66be-4f34-851f-4908eb22614f" containerID="679cbfdc14bb3d07c8bfe13fc180dc5b68454c4c640b96b3614ea951896cee7d" exitCode=0
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.548377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bcfc656b8-jq7ns" event={"ID":"ffa0c86e-66be-4f34-851f-4908eb22614f","Type":"ContainerDied","Data":"679cbfdc14bb3d07c8bfe13fc180dc5b68454c4c640b96b3614ea951896cee7d"}
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.548416 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bcfc656b8-jq7ns" event={"ID":"ffa0c86e-66be-4f34-851f-4908eb22614f","Type":"ContainerDied","Data":"0d00a448fa4e3d0b8a011563b64e623b32b9143cfe469f4028968cae81d389f7"}
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.548433 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d00a448fa4e3d0b8a011563b64e623b32b9143cfe469f4028968cae81d389f7"
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.553279 4707 generic.go:334] "Generic (PLEG): container finished" podID="0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64" containerID="b9d204c3a77405fd52e4a98e1a2adcf4f0ad5d8b9ad58dbfaa1562a18242efb6" exitCode=0
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.553365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" event={"ID":"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64","Type":"ContainerDied","Data":"b9d204c3a77405fd52e4a98e1a2adcf4f0ad5d8b9ad58dbfaa1562a18242efb6"}
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.560403 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerStarted","Data":"3c21fc1241bd5eebf97d5b7510c834ea1cc4be65a2ceaccaa79b5f0807fad485"}
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.568088 4707 generic.go:334] "Generic (PLEG): container finished" podID="3d368b24-5c6c-4828-af36-7d553daeee3c" containerID="bec773fe3ccd0405d46c8774cb69869fba69c76270893dd022abf7bc1e25a200" exitCode=0
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.568162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lzdkh" event={"ID":"3d368b24-5c6c-4828-af36-7d553daeee3c","Type":"ContainerDied","Data":"bec773fe3ccd0405d46c8774cb69869fba69c76270893dd022abf7bc1e25a200"}
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.570075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-74d4777d5f-4mj7v" event={"ID":"9b19f31f-481f-4feb-91bb-09df20de5654","Type":"ContainerStarted","Data":"9f2716570e5f09ba333f0b43f9d5790f0b13fc904a5d948470f4eba1375748cc"}
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.570425 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-74d4777d5f-4mj7v"
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.599249 4707 generic.go:334] "Generic (PLEG): container finished" podID="9b34d90a-4443-41fb-8220-d07465c9faa1" containerID="e802bb1aa8450675870feccb01861e8857fb6fec39300bf58877fb4a677f92f5" exitCode=0
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.600306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3f4f-account-create-update-jdxff" event={"ID":"9b34d90a-4443-41fb-8220-d07465c9faa1","Type":"ContainerDied","Data":"e802bb1aa8450675870feccb01861e8857fb6fec39300bf58877fb4a677f92f5"}
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.602901 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-78c45ff765-hw8sk" podStartSLOduration=9.602889519 podStartE2EDuration="9.602889519s" podCreationTimestamp="2026-01-29 03:47:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:31.583061974 +0000 UTC m=+1205.067290879" watchObservedRunningTime="2026-01-29 03:47:31.602889519 +0000 UTC m=+1205.087118424"
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.649328 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.714665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data-custom\") pod \"ffa0c86e-66be-4f34-851f-4908eb22614f\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") "
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.714713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data\") pod \"ffa0c86e-66be-4f34-851f-4908eb22614f\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") "
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.714795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-combined-ca-bundle\") pod \"ffa0c86e-66be-4f34-851f-4908eb22614f\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") "
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.715004 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7gz5\" (UniqueName: \"kubernetes.io/projected/ffa0c86e-66be-4f34-851f-4908eb22614f-kube-api-access-m7gz5\") pod \"ffa0c86e-66be-4f34-851f-4908eb22614f\" (UID: \"ffa0c86e-66be-4f34-851f-4908eb22614f\") "
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.735876 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-74d4777d5f-4mj7v" podStartSLOduration=9.735849528 podStartE2EDuration="9.735849528s" podCreationTimestamp="2026-01-29 03:47:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:31.716048614 +0000 UTC m=+1205.200277519" watchObservedRunningTime="2026-01-29 03:47:31.735849528 +0000 UTC m=+1205.220078433"
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.744833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa0c86e-66be-4f34-851f-4908eb22614f-kube-api-access-m7gz5" (OuterVolumeSpecName: "kube-api-access-m7gz5") pod "ffa0c86e-66be-4f34-851f-4908eb22614f" (UID: "ffa0c86e-66be-4f34-851f-4908eb22614f"). InnerVolumeSpecName "kube-api-access-m7gz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.760557 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ffa0c86e-66be-4f34-851f-4908eb22614f" (UID: "ffa0c86e-66be-4f34-851f-4908eb22614f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.795727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffa0c86e-66be-4f34-851f-4908eb22614f" (UID: "ffa0c86e-66be-4f34-851f-4908eb22614f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.825040 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7gz5\" (UniqueName: \"kubernetes.io/projected/ffa0c86e-66be-4f34-851f-4908eb22614f-kube-api-access-m7gz5\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.825095 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.825106 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.841257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data" (OuterVolumeSpecName: "config-data") pod "ffa0c86e-66be-4f34-851f-4908eb22614f" (UID: "ffa0c86e-66be-4f34-851f-4908eb22614f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.927275 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffa0c86e-66be-4f34-851f-4908eb22614f-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:31 crc kubenswrapper[4707]: I0129 03:47:31.980350 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.038870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd2dc\" (UniqueName: \"kubernetes.io/projected/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-kube-api-access-xd2dc\") pod \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\" (UID: \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\") "
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.039107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-operator-scripts\") pod \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\" (UID: \"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce\") "
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.046028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b79725c-fee3-4f6f-84b7-c0dbd52e75ce" (UID: "7b79725c-fee3-4f6f-84b7-c0dbd52e75ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.062295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-kube-api-access-xd2dc" (OuterVolumeSpecName: "kube-api-access-xd2dc") pod "7b79725c-fee3-4f6f-84b7-c0dbd52e75ce" (UID: "7b79725c-fee3-4f6f-84b7-c0dbd52e75ce"). InnerVolumeSpecName "kube-api-access-xd2dc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.146466 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd2dc\" (UniqueName: \"kubernetes.io/projected/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-kube-api-access-xd2dc\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.146499 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.200961 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.313050 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.451685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-httpd-config\") pod \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") "
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.451835 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-config\") pod \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") "
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.451916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmbc6\" (UniqueName: \"kubernetes.io/projected/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-kube-api-access-pmbc6\") pod \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") "
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.451971 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-ovndb-tls-certs\") pod \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") "
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.452046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-combined-ca-bundle\") pod \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\" (UID: \"e70752bb-f7b2-4cd4-ace7-b64b837a8e95\") "
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.458686 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-kube-api-access-pmbc6" (OuterVolumeSpecName: "kube-api-access-pmbc6") pod "e70752bb-f7b2-4cd4-ace7-b64b837a8e95" (UID: "e70752bb-f7b2-4cd4-ace7-b64b837a8e95"). InnerVolumeSpecName "kube-api-access-pmbc6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.460264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e70752bb-f7b2-4cd4-ace7-b64b837a8e95" (UID: "e70752bb-f7b2-4cd4-ace7-b64b837a8e95"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.509724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-config" (OuterVolumeSpecName: "config") pod "e70752bb-f7b2-4cd4-ace7-b64b837a8e95" (UID: "e70752bb-f7b2-4cd4-ace7-b64b837a8e95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.513045 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e70752bb-f7b2-4cd4-ace7-b64b837a8e95" (UID: "e70752bb-f7b2-4cd4-ace7-b64b837a8e95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.533343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e70752bb-f7b2-4cd4-ace7-b64b837a8e95" (UID: "e70752bb-f7b2-4cd4-ace7-b64b837a8e95"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.555783 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.555820 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmbc6\" (UniqueName: \"kubernetes.io/projected/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-kube-api-access-pmbc6\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.555834 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.555845 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.555853 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e70752bb-f7b2-4cd4-ace7-b64b837a8e95-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.628427 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5wxq8" event={"ID":"7b79725c-fee3-4f6f-84b7-c0dbd52e75ce","Type":"ContainerDied","Data":"02709186526e20d4178f93697f6e23f91ca199b7fc2f5978bced8b6b4b8674a8"}
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.628483 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02709186526e20d4178f93697f6e23f91ca199b7fc2f5978bced8b6b4b8674a8"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.628499 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5wxq8"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.632496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerStarted","Data":"f80a33cf4cedf71734034d60585189ea9bc2c9e7d57fbcab67e2c2708bf0ff03"}
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.635007 4707 generic.go:334] "Generic (PLEG): container finished" podID="0644f450-79f9-4f20-9476-e31d4b673507" containerID="f3b690b9de0e54234e49a2ed96c2a83a738f962884b444cf049bd1c3a92afd51" exitCode=1
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.635405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d64f59df8-kkdq9" event={"ID":"0644f450-79f9-4f20-9476-e31d4b673507","Type":"ContainerDied","Data":"f3b690b9de0e54234e49a2ed96c2a83a738f962884b444cf049bd1c3a92afd51"}
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.635517 4707 scope.go:117] "RemoveContainer" containerID="fb6306a4616a25fb57ef6e07cd8e59cd818f2fc89ec4acdc7bdb7c5a94ff2aba"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.636038 4707 scope.go:117] "RemoveContainer" containerID="f3b690b9de0e54234e49a2ed96c2a83a738f962884b444cf049bd1c3a92afd51"
Jan 29 03:47:32 crc kubenswrapper[4707]: E0129 03:47:32.636286 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6d64f59df8-kkdq9_openstack(0644f450-79f9-4f20-9476-e31d4b673507)\"" pod="openstack/heat-api-6d64f59df8-kkdq9" podUID="0644f450-79f9-4f20-9476-e31d4b673507"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.649403 4707 generic.go:334] "Generic (PLEG): container finished" podID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerID="479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f" exitCode=0
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.649485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f44bf9c6d-746dh" event={"ID":"e70752bb-f7b2-4cd4-ace7-b64b837a8e95","Type":"ContainerDied","Data":"479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f"}
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.649519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f44bf9c6d-746dh" event={"ID":"e70752bb-f7b2-4cd4-ace7-b64b837a8e95","Type":"ContainerDied","Data":"7d581f904e5abc065a51a67dc3a980c6ed1d27c9ce7b386c07500efa89132130"}
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.649680 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f44bf9c6d-746dh"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.664233 4707 generic.go:334] "Generic (PLEG): container finished" podID="69cae86f-c7b4-4298-b9c5-6925a215df89" containerID="9d129b477285d290e6f40cc01822bebe25d7339c926a81fe299d63d985426379" exitCode=1
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.665193 4707 scope.go:117] "RemoveContainer" containerID="9d129b477285d290e6f40cc01822bebe25d7339c926a81fe299d63d985426379"
Jan 29 03:47:32 crc kubenswrapper[4707]: E0129 03:47:32.665446 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5bf5cf4876-9r9d8_openstack(69cae86f-c7b4-4298-b9c5-6925a215df89)\"" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.665675 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" event={"ID":"69cae86f-c7b4-4298-b9c5-6925a215df89","Type":"ContainerDied","Data":"9d129b477285d290e6f40cc01822bebe25d7339c926a81fe299d63d985426379"}
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.665853 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5bcfc656b8-jq7ns"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.801495 4707 scope.go:117] "RemoveContainer" containerID="9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.826215 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f44bf9c6d-746dh"]
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.841491 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f44bf9c6d-746dh"]
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.857106 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5bcfc656b8-jq7ns"]
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.869273 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5bcfc656b8-jq7ns"]
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.878483 4707 scope.go:117] "RemoveContainer" containerID="479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.914087 4707 scope.go:117] "RemoveContainer" containerID="9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd"
Jan 29 03:47:32 crc kubenswrapper[4707]: E0129 03:47:32.915156 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd\": container with ID starting with 9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd not found: ID does not exist" containerID="9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.915198 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd"} err="failed to get container status \"9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd\": rpc error: code = NotFound desc = could not find container \"9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd\": container with ID starting with 9a4fbe38c9365f24a74aeafd9be7dc46cc2a690d54b58c2c131f5d84739311bd not found: ID does not exist"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.915235 4707 scope.go:117] "RemoveContainer" containerID="479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f"
Jan 29 03:47:32 crc kubenswrapper[4707]: E0129 03:47:32.916180 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f\": container with ID starting with 479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f not found: ID does not exist" containerID="479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.916229 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f"} err="failed to get container status \"479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f\": rpc error: code = NotFound desc = could not find container \"479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f\": container with ID starting with 479696f805f67d6e7b9ee9b190f1edf97c75f14f018f56ed1e78c74b7bd38a6f not found: ID does not exist"
Jan 29 03:47:32 crc kubenswrapper[4707]: I0129 03:47:32.916251 4707 scope.go:117] "RemoveContainer" containerID="279c0f8771b77a0572ff84fa44a8af13ee01cfa81330cf3d473f769b2eb7e89b"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.237300 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ptdvm"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.254413 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" path="/var/lib/kubelet/pods/e70752bb-f7b2-4cd4-ace7-b64b837a8e95/volumes"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.255247 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa0c86e-66be-4f34-851f-4908eb22614f" path="/var/lib/kubelet/pods/ffa0c86e-66be-4f34-851f-4908eb22614f/volumes"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.305667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fqx6\" (UniqueName: \"kubernetes.io/projected/fa6d4aa3-52da-4273-8dae-1ac01656cab9-kube-api-access-5fqx6\") pod \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\" (UID: \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\") "
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.306819 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa6d4aa3-52da-4273-8dae-1ac01656cab9-operator-scripts\") pod \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\" (UID: \"fa6d4aa3-52da-4273-8dae-1ac01656cab9\") "
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.307200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6d4aa3-52da-4273-8dae-1ac01656cab9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa6d4aa3-52da-4273-8dae-1ac01656cab9" (UID: "fa6d4aa3-52da-4273-8dae-1ac01656cab9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.307403 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa6d4aa3-52da-4273-8dae-1ac01656cab9-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.339920 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6d4aa3-52da-4273-8dae-1ac01656cab9-kube-api-access-5fqx6" (OuterVolumeSpecName: "kube-api-access-5fqx6") pod "fa6d4aa3-52da-4273-8dae-1ac01656cab9" (UID: "fa6d4aa3-52da-4273-8dae-1ac01656cab9"). InnerVolumeSpecName "kube-api-access-5fqx6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.410003 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fqx6\" (UniqueName: \"kubernetes.io/projected/fa6d4aa3-52da-4273-8dae-1ac01656cab9-kube-api-access-5fqx6\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.734173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerStarted","Data":"6130ef269f25da98fe9198b930a9bec8bb7c3899ad8ba67f4ee79bf04371bf13"}
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.738365 4707 scope.go:117] "RemoveContainer" containerID="f3b690b9de0e54234e49a2ed96c2a83a738f962884b444cf049bd1c3a92afd51"
Jan 29 03:47:33 crc kubenswrapper[4707]: E0129 03:47:33.738598 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6d64f59df8-kkdq9_openstack(0644f450-79f9-4f20-9476-e31d4b673507)\"" pod="openstack/heat-api-6d64f59df8-kkdq9" podUID="0644f450-79f9-4f20-9476-e31d4b673507"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.756582 4707 scope.go:117] "RemoveContainer" containerID="9d129b477285d290e6f40cc01822bebe25d7339c926a81fe299d63d985426379"
Jan 29 03:47:33 crc kubenswrapper[4707]: E0129 03:47:33.757026 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5bf5cf4876-9r9d8_openstack(69cae86f-c7b4-4298-b9c5-6925a215df89)\"" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.764931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" event={"ID":"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64","Type":"ContainerDied","Data":"70cc4dc92905c522cc97958f33f06624733942d8d72dc869a668f031b28c26f7"}
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.764991 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70cc4dc92905c522cc97958f33f06624733942d8d72dc869a668f031b28c26f7"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.785959 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ptdvm"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.786048 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ptdvm" event={"ID":"fa6d4aa3-52da-4273-8dae-1ac01656cab9","Type":"ContainerDied","Data":"dd346bd17926df16b0b84354bffabb19afac469ffe4f13e072d0af1357d3e536"}
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.786101 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd346bd17926df16b0b84354bffabb19afac469ffe4f13e072d0af1357d3e536"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.906313 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bbea-account-create-update-jcvzk"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.911141 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3f4f-account-create-update-jdxff"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.922283 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lzdkh"
Jan 29 03:47:33 crc kubenswrapper[4707]: I0129 03:47:33.927111 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aabf-account-create-update-xhfqq"
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.031637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4264d1cc-366e-412c-9f2d-8276c6b5fc70-operator-scripts\") pod \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\" (UID: \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\") "
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.031720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mt4v\" (UniqueName: \"kubernetes.io/projected/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-kube-api-access-9mt4v\") pod \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\" (UID: \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\") "
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.031768 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snfw7\" (UniqueName: \"kubernetes.io/projected/4264d1cc-366e-412c-9f2d-8276c6b5fc70-kube-api-access-snfw7\") pod \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\" (UID: \"4264d1cc-366e-412c-9f2d-8276c6b5fc70\") "
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.031841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds4qg\" (UniqueName: \"kubernetes.io/projected/9b34d90a-4443-41fb-8220-d07465c9faa1-kube-api-access-ds4qg\") pod \"9b34d90a-4443-41fb-8220-d07465c9faa1\" (UID: \"9b34d90a-4443-41fb-8220-d07465c9faa1\") "
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.031915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-operator-scripts\") pod \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\" (UID: \"0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64\") "
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.032005 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d368b24-5c6c-4828-af36-7d553daeee3c-operator-scripts\") pod \"3d368b24-5c6c-4828-af36-7d553daeee3c\" (UID: \"3d368b24-5c6c-4828-af36-7d553daeee3c\") "
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.032036 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lrzs\" (UniqueName: \"kubernetes.io/projected/3d368b24-5c6c-4828-af36-7d553daeee3c-kube-api-access-9lrzs\") pod \"3d368b24-5c6c-4828-af36-7d553daeee3c\" (UID: \"3d368b24-5c6c-4828-af36-7d553daeee3c\") "
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.032071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b34d90a-4443-41fb-8220-d07465c9faa1-operator-scripts\") pod \"9b34d90a-4443-41fb-8220-d07465c9faa1\" (UID: \"9b34d90a-4443-41fb-8220-d07465c9faa1\") "
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.032147 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4264d1cc-366e-412c-9f2d-8276c6b5fc70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4264d1cc-366e-412c-9f2d-8276c6b5fc70" (UID: "4264d1cc-366e-412c-9f2d-8276c6b5fc70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.032681 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4264d1cc-366e-412c-9f2d-8276c6b5fc70-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.033182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d368b24-5c6c-4828-af36-7d553daeee3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d368b24-5c6c-4828-af36-7d553daeee3c" (UID: "3d368b24-5c6c-4828-af36-7d553daeee3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.033295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b34d90a-4443-41fb-8220-d07465c9faa1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b34d90a-4443-41fb-8220-d07465c9faa1" (UID: "9b34d90a-4443-41fb-8220-d07465c9faa1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.033597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64" (UID: "0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.038162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4264d1cc-366e-412c-9f2d-8276c6b5fc70-kube-api-access-snfw7" (OuterVolumeSpecName: "kube-api-access-snfw7") pod "4264d1cc-366e-412c-9f2d-8276c6b5fc70" (UID: "4264d1cc-366e-412c-9f2d-8276c6b5fc70"). InnerVolumeSpecName "kube-api-access-snfw7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.043788 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b34d90a-4443-41fb-8220-d07465c9faa1-kube-api-access-ds4qg" (OuterVolumeSpecName: "kube-api-access-ds4qg") pod "9b34d90a-4443-41fb-8220-d07465c9faa1" (UID: "9b34d90a-4443-41fb-8220-d07465c9faa1"). InnerVolumeSpecName "kube-api-access-ds4qg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.051329 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d368b24-5c6c-4828-af36-7d553daeee3c-kube-api-access-9lrzs" (OuterVolumeSpecName: "kube-api-access-9lrzs") pod "3d368b24-5c6c-4828-af36-7d553daeee3c" (UID: "3d368b24-5c6c-4828-af36-7d553daeee3c"). InnerVolumeSpecName "kube-api-access-9lrzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.053396 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-kube-api-access-9mt4v" (OuterVolumeSpecName: "kube-api-access-9mt4v") pod "0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64" (UID: "0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64"). InnerVolumeSpecName "kube-api-access-9mt4v".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.135158 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mt4v\" (UniqueName: \"kubernetes.io/projected/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-kube-api-access-9mt4v\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.135197 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snfw7\" (UniqueName: \"kubernetes.io/projected/4264d1cc-366e-412c-9f2d-8276c6b5fc70-kube-api-access-snfw7\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.135225 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds4qg\" (UniqueName: \"kubernetes.io/projected/9b34d90a-4443-41fb-8220-d07465c9faa1-kube-api-access-ds4qg\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.135236 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.135247 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d368b24-5c6c-4828-af36-7d553daeee3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.135257 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lrzs\" (UniqueName: \"kubernetes.io/projected/3d368b24-5c6c-4828-af36-7d553daeee3c-kube-api-access-9lrzs\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.135266 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b34d90a-4443-41fb-8220-d07465c9faa1-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.464005 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.798270 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aabf-account-create-update-xhfqq" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.798293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aabf-account-create-update-xhfqq" event={"ID":"4264d1cc-366e-412c-9f2d-8276c6b5fc70","Type":"ContainerDied","Data":"72f4cef6b08383a7028b6fe064b13dff00a7a8364c3083261de59f745393a106"} Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.798824 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f4cef6b08383a7028b6fe064b13dff00a7a8364c3083261de59f745393a106" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.799842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lzdkh" event={"ID":"3d368b24-5c6c-4828-af36-7d553daeee3c","Type":"ContainerDied","Data":"81d52299c251b1cc855b32cc373aa836e250cba2a732532a887752967c3fc5ef"} Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.799898 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81d52299c251b1cc855b32cc373aa836e250cba2a732532a887752967c3fc5ef" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.799992 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lzdkh" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.801336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3f4f-account-create-update-jdxff" event={"ID":"9b34d90a-4443-41fb-8220-d07465c9faa1","Type":"ContainerDied","Data":"03bdeeab01e5e6819ec47ed688d2b5db9061320992ccfc9c0481e02f8798441f"} Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.801383 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03bdeeab01e5e6819ec47ed688d2b5db9061320992ccfc9c0481e02f8798441f" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.801394 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bbea-account-create-update-jcvzk" Jan 29 03:47:34 crc kubenswrapper[4707]: I0129 03:47:34.801418 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3f4f-account-create-update-jdxff" Jan 29 03:47:35 crc kubenswrapper[4707]: I0129 03:47:35.819239 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerStarted","Data":"bb3092f7be5ef0ecc0c3b06e54ca98bf2c046ff1aceaf0b3ba59d22f85e91c3c"} Jan 29 03:47:35 crc kubenswrapper[4707]: I0129 03:47:35.819560 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="ceilometer-central-agent" containerID="cri-o://3c21fc1241bd5eebf97d5b7510c834ea1cc4be65a2ceaccaa79b5f0807fad485" gracePeriod=30 Jan 29 03:47:35 crc kubenswrapper[4707]: I0129 03:47:35.819642 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="proxy-httpd" 
containerID="cri-o://bb3092f7be5ef0ecc0c3b06e54ca98bf2c046ff1aceaf0b3ba59d22f85e91c3c" gracePeriod=30 Jan 29 03:47:35 crc kubenswrapper[4707]: I0129 03:47:35.819642 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="sg-core" containerID="cri-o://6130ef269f25da98fe9198b930a9bec8bb7c3899ad8ba67f4ee79bf04371bf13" gracePeriod=30 Jan 29 03:47:35 crc kubenswrapper[4707]: I0129 03:47:35.819660 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="ceilometer-notification-agent" containerID="cri-o://f80a33cf4cedf71734034d60585189ea9bc2c9e7d57fbcab67e2c2708bf0ff03" gracePeriod=30 Jan 29 03:47:35 crc kubenswrapper[4707]: I0129 03:47:35.819806 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 03:47:35 crc kubenswrapper[4707]: I0129 03:47:35.842520 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=10.069741818 podStartE2EDuration="15.842496467s" podCreationTimestamp="2026-01-29 03:47:20 +0000 UTC" firstStartedPulling="2026-01-29 03:47:29.443494641 +0000 UTC m=+1202.927723546" lastFinishedPulling="2026-01-29 03:47:35.21624929 +0000 UTC m=+1208.700478195" observedRunningTime="2026-01-29 03:47:35.839499241 +0000 UTC m=+1209.323728156" watchObservedRunningTime="2026-01-29 03:47:35.842496467 +0000 UTC m=+1209.326725372" Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.038157 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.038688 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.039761 4707 
scope.go:117] "RemoveContainer" containerID="f3b690b9de0e54234e49a2ed96c2a83a738f962884b444cf049bd1c3a92afd51" Jan 29 03:47:36 crc kubenswrapper[4707]: E0129 03:47:36.040262 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6d64f59df8-kkdq9_openstack(0644f450-79f9-4f20-9476-e31d4b673507)\"" pod="openstack/heat-api-6d64f59df8-kkdq9" podUID="0644f450-79f9-4f20-9476-e31d4b673507" Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.073851 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.074191 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.075352 4707 scope.go:117] "RemoveContainer" containerID="9d129b477285d290e6f40cc01822bebe25d7339c926a81fe299d63d985426379" Jan 29 03:47:36 crc kubenswrapper[4707]: E0129 03:47:36.075660 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5bf5cf4876-9r9d8_openstack(69cae86f-c7b4-4298-b9c5-6925a215df89)\"" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89" Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.502141 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.832645 4707 generic.go:334] "Generic (PLEG): container finished" podID="e891f1aa-f142-469b-891f-01f8930884d3" containerID="bb3092f7be5ef0ecc0c3b06e54ca98bf2c046ff1aceaf0b3ba59d22f85e91c3c" exitCode=0 Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.832689 4707 
generic.go:334] "Generic (PLEG): container finished" podID="e891f1aa-f142-469b-891f-01f8930884d3" containerID="6130ef269f25da98fe9198b930a9bec8bb7c3899ad8ba67f4ee79bf04371bf13" exitCode=2 Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.832707 4707 generic.go:334] "Generic (PLEG): container finished" podID="e891f1aa-f142-469b-891f-01f8930884d3" containerID="f80a33cf4cedf71734034d60585189ea9bc2c9e7d57fbcab67e2c2708bf0ff03" exitCode=0 Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.832718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerDied","Data":"bb3092f7be5ef0ecc0c3b06e54ca98bf2c046ff1aceaf0b3ba59d22f85e91c3c"} Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.832776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerDied","Data":"6130ef269f25da98fe9198b930a9bec8bb7c3899ad8ba67f4ee79bf04371bf13"} Jan 29 03:47:36 crc kubenswrapper[4707]: I0129 03:47:36.832796 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerDied","Data":"f80a33cf4cedf71734034d60585189ea9bc2c9e7d57fbcab67e2c2708bf0ff03"} Jan 29 03:47:37 crc kubenswrapper[4707]: I0129 03:47:37.857699 4707 generic.go:334] "Generic (PLEG): container finished" podID="e891f1aa-f142-469b-891f-01f8930884d3" containerID="3c21fc1241bd5eebf97d5b7510c834ea1cc4be65a2ceaccaa79b5f0807fad485" exitCode=0 Jan 29 03:47:37 crc kubenswrapper[4707]: I0129 03:47:37.858181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerDied","Data":"3c21fc1241bd5eebf97d5b7510c834ea1cc4be65a2ceaccaa79b5f0807fad485"} Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.277836 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.362470 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-combined-ca-bundle\") pod \"e891f1aa-f142-469b-891f-01f8930884d3\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.362575 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-run-httpd\") pod \"e891f1aa-f142-469b-891f-01f8930884d3\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.362603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-log-httpd\") pod \"e891f1aa-f142-469b-891f-01f8930884d3\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.362641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sjll\" (UniqueName: \"kubernetes.io/projected/e891f1aa-f142-469b-891f-01f8930884d3-kube-api-access-9sjll\") pod \"e891f1aa-f142-469b-891f-01f8930884d3\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.362903 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-config-data\") pod \"e891f1aa-f142-469b-891f-01f8930884d3\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.362964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-scripts\") pod \"e891f1aa-f142-469b-891f-01f8930884d3\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.363070 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-sg-core-conf-yaml\") pod \"e891f1aa-f142-469b-891f-01f8930884d3\" (UID: \"e891f1aa-f142-469b-891f-01f8930884d3\") " Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.364655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e891f1aa-f142-469b-891f-01f8930884d3" (UID: "e891f1aa-f142-469b-891f-01f8930884d3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.365789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e891f1aa-f142-469b-891f-01f8930884d3" (UID: "e891f1aa-f142-469b-891f-01f8930884d3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.374365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-scripts" (OuterVolumeSpecName: "scripts") pod "e891f1aa-f142-469b-891f-01f8930884d3" (UID: "e891f1aa-f142-469b-891f-01f8930884d3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.385797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e891f1aa-f142-469b-891f-01f8930884d3-kube-api-access-9sjll" (OuterVolumeSpecName: "kube-api-access-9sjll") pod "e891f1aa-f142-469b-891f-01f8930884d3" (UID: "e891f1aa-f142-469b-891f-01f8930884d3"). InnerVolumeSpecName "kube-api-access-9sjll". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.466395 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.466432 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e891f1aa-f142-469b-891f-01f8930884d3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.466443 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sjll\" (UniqueName: \"kubernetes.io/projected/e891f1aa-f142-469b-891f-01f8930884d3-kube-api-access-9sjll\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.466456 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.479146 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e891f1aa-f142-469b-891f-01f8930884d3" (UID: "e891f1aa-f142-469b-891f-01f8930884d3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.501027 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e891f1aa-f142-469b-891f-01f8930884d3" (UID: "e891f1aa-f142-469b-891f-01f8930884d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.535802 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-config-data" (OuterVolumeSpecName: "config-data") pod "e891f1aa-f142-469b-891f-01f8930884d3" (UID: "e891f1aa-f142-469b-891f-01f8930884d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.569759 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.569816 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.569831 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e891f1aa-f142-469b-891f-01f8930884d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.870966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e891f1aa-f142-469b-891f-01f8930884d3","Type":"ContainerDied","Data":"1f985a7694692100502ce2f7a23a87a2d110366f998da047683adac61c916492"} Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.871547 4707 scope.go:117] "RemoveContainer" containerID="bb3092f7be5ef0ecc0c3b06e54ca98bf2c046ff1aceaf0b3ba59d22f85e91c3c" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.871081 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.940836 4707 scope.go:117] "RemoveContainer" containerID="6130ef269f25da98fe9198b930a9bec8bb7c3899ad8ba67f4ee79bf04371bf13" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.942853 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.944864 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.975724 4707 scope.go:117] "RemoveContainer" containerID="f80a33cf4cedf71734034d60585189ea9bc2c9e7d57fbcab67e2c2708bf0ff03" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.975891 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976375 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="ceilometer-notification-agent" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976396 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="ceilometer-notification-agent" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976411 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6d4aa3-52da-4273-8dae-1ac01656cab9" containerName="mariadb-database-create" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 
03:47:38.976419 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6d4aa3-52da-4273-8dae-1ac01656cab9" containerName="mariadb-database-create" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976431 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" containerName="dnsmasq-dns" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976438 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" containerName="dnsmasq-dns" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976456 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerName="neutron-httpd" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976463 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerName="neutron-httpd" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976475 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b79725c-fee3-4f6f-84b7-c0dbd52e75ce" containerName="mariadb-database-create" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976482 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b79725c-fee3-4f6f-84b7-c0dbd52e75ce" containerName="mariadb-database-create" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976495 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa0c86e-66be-4f34-851f-4908eb22614f" containerName="heat-api" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976501 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa0c86e-66be-4f34-851f-4908eb22614f" containerName="heat-api" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976512 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="sg-core" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976518 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="sg-core" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976528 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b34d90a-4443-41fb-8220-d07465c9faa1" containerName="mariadb-account-create-update" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976590 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b34d90a-4443-41fb-8220-d07465c9faa1" containerName="mariadb-account-create-update" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976602 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64" containerName="mariadb-account-create-update" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976608 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64" containerName="mariadb-account-create-update" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976619 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="ceilometer-central-agent" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976627 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="ceilometer-central-agent" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976642 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="proxy-httpd" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976651 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="proxy-httpd" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976662 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerName="neutron-api" Jan 29 03:47:38 crc 
kubenswrapper[4707]: I0129 03:47:38.976671 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerName="neutron-api" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976685 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4264d1cc-366e-412c-9f2d-8276c6b5fc70" containerName="mariadb-account-create-update" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976693 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4264d1cc-366e-412c-9f2d-8276c6b5fc70" containerName="mariadb-account-create-update" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976703 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d368b24-5c6c-4828-af36-7d553daeee3c" containerName="mariadb-database-create" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976709 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d368b24-5c6c-4828-af36-7d553daeee3c" containerName="mariadb-database-create" Jan 29 03:47:38 crc kubenswrapper[4707]: E0129 03:47:38.976719 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" containerName="init" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976725 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" containerName="init" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976921 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerName="neutron-api" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976932 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa0c86e-66be-4f34-851f-4908eb22614f" containerName="heat-api" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976942 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6d4aa3-52da-4273-8dae-1ac01656cab9" containerName="mariadb-database-create" Jan 29 
03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976955 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b34d90a-4443-41fb-8220-d07465c9faa1" containerName="mariadb-account-create-update" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976964 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8317cf-d24b-4fc7-a1d6-dd4f2881bb92" containerName="dnsmasq-dns" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976973 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70752bb-f7b2-4cd4-ace7-b64b837a8e95" containerName="neutron-httpd" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976982 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64" containerName="mariadb-account-create-update" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.976991 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="proxy-httpd" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.977000 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="ceilometer-notification-agent" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.977012 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="sg-core" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.977019 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d368b24-5c6c-4828-af36-7d553daeee3c" containerName="mariadb-database-create" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.977029 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e891f1aa-f142-469b-891f-01f8930884d3" containerName="ceilometer-central-agent" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.977042 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7b79725c-fee3-4f6f-84b7-c0dbd52e75ce" containerName="mariadb-database-create" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.977050 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4264d1cc-366e-412c-9f2d-8276c6b5fc70" containerName="mariadb-account-create-update" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.978823 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.983092 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.983839 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 03:47:38 crc kubenswrapper[4707]: I0129 03:47:38.985057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.012162 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-78c45ff765-hw8sk" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.020406 4707 scope.go:117] "RemoveContainer" containerID="3c21fc1241bd5eebf97d5b7510c834ea1cc4be65a2ceaccaa79b5f0807fad485" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.084139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.084225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6xhm\" (UniqueName: \"kubernetes.io/projected/0ffdcc22-dfec-478d-a1b7-217b174ee80e-kube-api-access-q6xhm\") pod 
\"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.084252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.084331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-log-httpd\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.084383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-config-data\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.084415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-run-httpd\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.084433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-scripts\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.108315 
4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5bf5cf4876-9r9d8"] Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.190207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-log-httpd\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.190288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-config-data\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.190340 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-run-httpd\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.190360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-scripts\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.190424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.190448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q6xhm\" (UniqueName: \"kubernetes.io/projected/0ffdcc22-dfec-478d-a1b7-217b174ee80e-kube-api-access-q6xhm\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.190470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.191986 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-run-httpd\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.192267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-log-httpd\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.274147 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e891f1aa-f142-469b-891f-01f8930884d3" path="/var/lib/kubelet/pods/e891f1aa-f142-469b-891f-01f8930884d3/volumes" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.279412 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5bzlc"] Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.282343 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.290406 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.291139 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pkw2v" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.291270 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.314780 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6xhm\" (UniqueName: \"kubernetes.io/projected/0ffdcc22-dfec-478d-a1b7-217b174ee80e-kube-api-access-q6xhm\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.316251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-config-data\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.321068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-scripts\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.323130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 
crc kubenswrapper[4707]: I0129 03:47:39.327633 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5bzlc"] Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.337893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") " pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.376715 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-74d4777d5f-4mj7v" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.399844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.399934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7k62\" (UniqueName: \"kubernetes.io/projected/e9e0c927-bc87-4eb0-b565-e30b4278331c-kube-api-access-v7k62\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.400028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-config-data\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: 
I0129 03:47:39.400209 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-scripts\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.451741 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6d64f59df8-kkdq9"] Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.524397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-scripts\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.524744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.524870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7k62\" (UniqueName: \"kubernetes.io/projected/e9e0c927-bc87-4eb0-b565-e30b4278331c-kube-api-access-v7k62\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.525051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-config-data\") pod 
\"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.530842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-config-data\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.539750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-scripts\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.560390 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7k62\" (UniqueName: \"kubernetes.io/projected/e9e0c927-bc87-4eb0-b565-e30b4278331c-kube-api-access-v7k62\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.560784 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5bzlc\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.613200 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.675556 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.730975 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.731667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data-custom\") pod \"69cae86f-c7b4-4298-b9c5-6925a215df89\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.731739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data\") pod \"69cae86f-c7b4-4298-b9c5-6925a215df89\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.731920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-combined-ca-bundle\") pod \"69cae86f-c7b4-4298-b9c5-6925a215df89\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.731982 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmqmt\" (UniqueName: \"kubernetes.io/projected/69cae86f-c7b4-4298-b9c5-6925a215df89-kube-api-access-tmqmt\") pod \"69cae86f-c7b4-4298-b9c5-6925a215df89\" (UID: \"69cae86f-c7b4-4298-b9c5-6925a215df89\") " Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.737611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69cae86f-c7b4-4298-b9c5-6925a215df89-kube-api-access-tmqmt" (OuterVolumeSpecName: "kube-api-access-tmqmt") pod 
"69cae86f-c7b4-4298-b9c5-6925a215df89" (UID: "69cae86f-c7b4-4298-b9c5-6925a215df89"). InnerVolumeSpecName "kube-api-access-tmqmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.745280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "69cae86f-c7b4-4298-b9c5-6925a215df89" (UID: "69cae86f-c7b4-4298-b9c5-6925a215df89"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.785867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69cae86f-c7b4-4298-b9c5-6925a215df89" (UID: "69cae86f-c7b4-4298-b9c5-6925a215df89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.835789 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.835823 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmqmt\" (UniqueName: \"kubernetes.io/projected/69cae86f-c7b4-4298-b9c5-6925a215df89-kube-api-access-tmqmt\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.836255 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.844029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data" (OuterVolumeSpecName: "config-data") pod "69cae86f-c7b4-4298-b9c5-6925a215df89" (UID: "69cae86f-c7b4-4298-b9c5-6925a215df89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.909013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" event={"ID":"69cae86f-c7b4-4298-b9c5-6925a215df89","Type":"ContainerDied","Data":"20cd6c50fc70370e1997b9d33c03aaebcd619cf28d1def9429db3b40fa1b797e"} Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.909107 4707 scope.go:117] "RemoveContainer" containerID="9d129b477285d290e6f40cc01822bebe25d7339c926a81fe299d63d985426379" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.909047 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5bf5cf4876-9r9d8" Jan 29 03:47:39 crc kubenswrapper[4707]: I0129 03:47:39.938573 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69cae86f-c7b4-4298-b9c5-6925a215df89-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.032530 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.038658 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5bf5cf4876-9r9d8"] Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.054598 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5bf5cf4876-9r9d8"] Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.162202 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8zxn\" (UniqueName: \"kubernetes.io/projected/0644f450-79f9-4f20-9476-e31d4b673507-kube-api-access-c8zxn\") pod \"0644f450-79f9-4f20-9476-e31d4b673507\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.163236 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data\") pod \"0644f450-79f9-4f20-9476-e31d4b673507\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.164147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data-custom\") pod \"0644f450-79f9-4f20-9476-e31d4b673507\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.184422 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-combined-ca-bundle\") pod \"0644f450-79f9-4f20-9476-e31d4b673507\" (UID: \"0644f450-79f9-4f20-9476-e31d4b673507\") " Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.173116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0644f450-79f9-4f20-9476-e31d4b673507-kube-api-access-c8zxn" (OuterVolumeSpecName: "kube-api-access-c8zxn") pod "0644f450-79f9-4f20-9476-e31d4b673507" (UID: "0644f450-79f9-4f20-9476-e31d4b673507"). InnerVolumeSpecName "kube-api-access-c8zxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.195688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0644f450-79f9-4f20-9476-e31d4b673507" (UID: "0644f450-79f9-4f20-9476-e31d4b673507"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.239578 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.253192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0644f450-79f9-4f20-9476-e31d4b673507" (UID: "0644f450-79f9-4f20-9476-e31d4b673507"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.274488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data" (OuterVolumeSpecName: "config-data") pod "0644f450-79f9-4f20-9476-e31d4b673507" (UID: "0644f450-79f9-4f20-9476-e31d4b673507"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.289337 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.289381 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8zxn\" (UniqueName: \"kubernetes.io/projected/0644f450-79f9-4f20-9476-e31d4b673507-kube-api-access-c8zxn\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.289392 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.289403 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0644f450-79f9-4f20-9476-e31d4b673507-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:40 crc kubenswrapper[4707]: W0129 03:47:40.492993 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9e0c927_bc87_4eb0_b565_e30b4278331c.slice/crio-0ef1b96e5d7aade041802c4cf84c1b0b57dcd49a891bda87e421f5a52c69668a WatchSource:0}: Error finding container 0ef1b96e5d7aade041802c4cf84c1b0b57dcd49a891bda87e421f5a52c69668a: Status 404 
returned error can't find the container with id 0ef1b96e5d7aade041802c4cf84c1b0b57dcd49a891bda87e421f5a52c69668a Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.498530 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5bzlc"] Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.717845 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.842091 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9f459ff7d-tkv2s" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.941425 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-54fd6b997b-lbt29"] Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.941752 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-54fd6b997b-lbt29" podUID="396ac802-02ea-480d-bd17-d18d64e8958f" containerName="placement-log" containerID="cri-o://b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398" gracePeriod=30 Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.942292 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-54fd6b997b-lbt29" podUID="396ac802-02ea-480d-bd17-d18d64e8958f" containerName="placement-api" containerID="cri-o://523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f" gracePeriod=30 Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.947981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerStarted","Data":"f3fd8fe74ecadf1292ad95b2188474fbc69493ef9f9c3c4403546132bcecbe15"} Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.948119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerStarted","Data":"9416567b0e29ca971b4076c89b7bccc583772008ef0c71f3201a596d2d128280"} Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.950069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5bzlc" event={"ID":"e9e0c927-bc87-4eb0-b565-e30b4278331c","Type":"ContainerStarted","Data":"0ef1b96e5d7aade041802c4cf84c1b0b57dcd49a891bda87e421f5a52c69668a"} Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.953273 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d64f59df8-kkdq9" Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.954794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d64f59df8-kkdq9" event={"ID":"0644f450-79f9-4f20-9476-e31d4b673507","Type":"ContainerDied","Data":"7dfa1052be66ce6b809419604907016c2068c17840a03451d083e3c33faad9f6"} Jan 29 03:47:40 crc kubenswrapper[4707]: I0129 03:47:40.954883 4707 scope.go:117] "RemoveContainer" containerID="f3b690b9de0e54234e49a2ed96c2a83a738f962884b444cf049bd1c3a92afd51" Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.018720 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6d64f59df8-kkdq9"] Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.049678 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6d64f59df8-kkdq9"] Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.109266 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7b6695978c-nxvfw" Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.177066 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-79896f576-thgpc"] Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.177390 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-79896f576-thgpc" 
podUID="35911d95-d558-41c9-9c2b-811b60410a49" containerName="heat-engine" containerID="cri-o://18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc" gracePeriod=60
Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.267966 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0644f450-79f9-4f20-9476-e31d4b673507" path="/var/lib/kubelet/pods/0644f450-79f9-4f20-9476-e31d4b673507/volumes"
Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.268754 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89" path="/var/lib/kubelet/pods/69cae86f-c7b4-4298-b9c5-6925a215df89/volumes"
Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.975837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerStarted","Data":"3a427035625ca6abc64fbf3edddc23d3a22b87d86f4c534b56a3959f1deab348"}
Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.985452 4707 generic.go:334] "Generic (PLEG): container finished" podID="396ac802-02ea-480d-bd17-d18d64e8958f" containerID="b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398" exitCode=143
Jan 29 03:47:41 crc kubenswrapper[4707]: I0129 03:47:41.985549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54fd6b997b-lbt29" event={"ID":"396ac802-02ea-480d-bd17-d18d64e8958f","Type":"ContainerDied","Data":"b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398"}
Jan 29 03:47:43 crc kubenswrapper[4707]: I0129 03:47:43.008624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerStarted","Data":"a88106e0f751c9ae1b497337d8216d68dc1557b55222edd615918a3faf6077a4"}
Jan 29 03:47:44 crc kubenswrapper[4707]: E0129 03:47:44.435423 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Jan 29 03:47:44 crc kubenswrapper[4707]: E0129 03:47:44.437854 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Jan 29 03:47:44 crc kubenswrapper[4707]: E0129 03:47:44.440025 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Jan 29 03:47:44 crc kubenswrapper[4707]: E0129 03:47:44.440093 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-79896f576-thgpc" podUID="35911d95-d558-41c9-9c2b-811b60410a49" containerName="heat-engine"
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.442321 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54fd6b997b-lbt29"
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.639739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-config-data\") pod \"396ac802-02ea-480d-bd17-d18d64e8958f\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") "
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.639812 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-internal-tls-certs\") pod \"396ac802-02ea-480d-bd17-d18d64e8958f\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") "
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.640083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/396ac802-02ea-480d-bd17-d18d64e8958f-logs\") pod \"396ac802-02ea-480d-bd17-d18d64e8958f\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") "
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.640119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-scripts\") pod \"396ac802-02ea-480d-bd17-d18d64e8958f\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") "
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.640188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h8db\" (UniqueName: \"kubernetes.io/projected/396ac802-02ea-480d-bd17-d18d64e8958f-kube-api-access-2h8db\") pod \"396ac802-02ea-480d-bd17-d18d64e8958f\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") "
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.640231 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-combined-ca-bundle\") pod \"396ac802-02ea-480d-bd17-d18d64e8958f\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") "
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.640290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-public-tls-certs\") pod \"396ac802-02ea-480d-bd17-d18d64e8958f\" (UID: \"396ac802-02ea-480d-bd17-d18d64e8958f\") "
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.640997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396ac802-02ea-480d-bd17-d18d64e8958f-logs" (OuterVolumeSpecName: "logs") pod "396ac802-02ea-480d-bd17-d18d64e8958f" (UID: "396ac802-02ea-480d-bd17-d18d64e8958f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.647015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396ac802-02ea-480d-bd17-d18d64e8958f-kube-api-access-2h8db" (OuterVolumeSpecName: "kube-api-access-2h8db") pod "396ac802-02ea-480d-bd17-d18d64e8958f" (UID: "396ac802-02ea-480d-bd17-d18d64e8958f"). InnerVolumeSpecName "kube-api-access-2h8db". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.647041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-scripts" (OuterVolumeSpecName: "scripts") pod "396ac802-02ea-480d-bd17-d18d64e8958f" (UID: "396ac802-02ea-480d-bd17-d18d64e8958f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.713413 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-config-data" (OuterVolumeSpecName: "config-data") pod "396ac802-02ea-480d-bd17-d18d64e8958f" (UID: "396ac802-02ea-480d-bd17-d18d64e8958f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.715017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "396ac802-02ea-480d-bd17-d18d64e8958f" (UID: "396ac802-02ea-480d-bd17-d18d64e8958f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.743387 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/396ac802-02ea-480d-bd17-d18d64e8958f-logs\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.743427 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.743441 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h8db\" (UniqueName: \"kubernetes.io/projected/396ac802-02ea-480d-bd17-d18d64e8958f-kube-api-access-2h8db\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.743462 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.743475 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.762821 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "396ac802-02ea-480d-bd17-d18d64e8958f" (UID: "396ac802-02ea-480d-bd17-d18d64e8958f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.793119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "396ac802-02ea-480d-bd17-d18d64e8958f" (UID: "396ac802-02ea-480d-bd17-d18d64e8958f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.845809 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:44 crc kubenswrapper[4707]: I0129 03:47:44.845853 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/396ac802-02ea-480d-bd17-d18d64e8958f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.094063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerStarted","Data":"eae999602fdb4272562c1b24fb7e3468e84238e6bb363729ced094440c6257d9"}
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.094400 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.102241 4707 generic.go:334] "Generic (PLEG): container finished" podID="396ac802-02ea-480d-bd17-d18d64e8958f" containerID="523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f" exitCode=0
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.102366 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54fd6b997b-lbt29" event={"ID":"396ac802-02ea-480d-bd17-d18d64e8958f","Type":"ContainerDied","Data":"523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f"}
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.102409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54fd6b997b-lbt29" event={"ID":"396ac802-02ea-480d-bd17-d18d64e8958f","Type":"ContainerDied","Data":"ee3314c54a0165b6ed87367e46d3e531aa594c1d5a33502479307545c5317161"}
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.102469 4707 scope.go:117] "RemoveContainer" containerID="523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f"
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.103026 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54fd6b997b-lbt29"
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.128583 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.468528213 podStartE2EDuration="7.128557185s" podCreationTimestamp="2026-01-29 03:47:38 +0000 UTC" firstStartedPulling="2026-01-29 03:47:40.229119553 +0000 UTC m=+1213.713348458" lastFinishedPulling="2026-01-29 03:47:43.889148525 +0000 UTC m=+1217.373377430" observedRunningTime="2026-01-29 03:47:45.124633493 +0000 UTC m=+1218.608862398" watchObservedRunningTime="2026-01-29 03:47:45.128557185 +0000 UTC m=+1218.612786090"
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.185609 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-54fd6b997b-lbt29"]
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.195397 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-54fd6b997b-lbt29"]
Jan 29 03:47:45 crc kubenswrapper[4707]: I0129 03:47:45.262333 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396ac802-02ea-480d-bd17-d18d64e8958f" path="/var/lib/kubelet/pods/396ac802-02ea-480d-bd17-d18d64e8958f/volumes"
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.360747 4707 scope.go:117] "RemoveContainer" containerID="b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398"
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.432955 4707 scope.go:117] "RemoveContainer" containerID="523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f"
Jan 29 03:47:51 crc kubenswrapper[4707]: E0129 03:47:51.435770 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f\": container with ID starting with 523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f not found: ID does not exist" containerID="523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f"
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.435831 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f"} err="failed to get container status \"523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f\": rpc error: code = NotFound desc = could not find container \"523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f\": container with ID starting with 523ac4d76b2da0116fbd53eba0124de16c075a94276df02ae035f2ea2773480f not found: ID does not exist"
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.435860 4707 scope.go:117] "RemoveContainer" containerID="b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398"
Jan 29 03:47:51 crc kubenswrapper[4707]: E0129 03:47:51.436896 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398\": container with ID starting with b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398 not found: ID does not exist" containerID="b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398"
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.436965 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398"} err="failed to get container status \"b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398\": rpc error: code = NotFound desc = could not find container \"b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398\": container with ID starting with b47f69e1aa2dfaf150969fb510ad72b2834ae2fb47d37a024089b94a5a425398 not found: ID does not exist"
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.440689 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.440965 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-log" containerID="cri-o://9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b" gracePeriod=30
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.441098 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-httpd" containerID="cri-o://ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48" gracePeriod=30
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.789897 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.790529 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="ceilometer-central-agent" containerID="cri-o://f3fd8fe74ecadf1292ad95b2188474fbc69493ef9f9c3c4403546132bcecbe15" gracePeriod=30
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.790638 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="proxy-httpd" containerID="cri-o://eae999602fdb4272562c1b24fb7e3468e84238e6bb363729ced094440c6257d9" gracePeriod=30
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.790686 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="sg-core" containerID="cri-o://a88106e0f751c9ae1b497337d8216d68dc1557b55222edd615918a3faf6077a4" gracePeriod=30
Jan 29 03:47:51 crc kubenswrapper[4707]: I0129 03:47:51.790899 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="ceilometer-notification-agent" containerID="cri-o://3a427035625ca6abc64fbf3edddc23d3a22b87d86f4c534b56a3959f1deab348" gracePeriod=30
Jan 29 03:47:52 crc kubenswrapper[4707]: I0129 03:47:52.178177 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0871cd7-6629-480c-801d-73c00a747882" containerID="9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b" exitCode=143
Jan 29 03:47:52 crc kubenswrapper[4707]: I0129 03:47:52.178287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0871cd7-6629-480c-801d-73c00a747882","Type":"ContainerDied","Data":"9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b"}
Jan 29 03:47:52 crc kubenswrapper[4707]: I0129 03:47:52.182221 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerID="eae999602fdb4272562c1b24fb7e3468e84238e6bb363729ced094440c6257d9" exitCode=0
Jan 29 03:47:52 crc kubenswrapper[4707]: I0129 03:47:52.182260 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerID="a88106e0f751c9ae1b497337d8216d68dc1557b55222edd615918a3faf6077a4" exitCode=2
Jan 29 03:47:52 crc kubenswrapper[4707]: I0129 03:47:52.182304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerDied","Data":"eae999602fdb4272562c1b24fb7e3468e84238e6bb363729ced094440c6257d9"}
Jan 29 03:47:52 crc kubenswrapper[4707]: I0129 03:47:52.182373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerDied","Data":"a88106e0f751c9ae1b497337d8216d68dc1557b55222edd615918a3faf6077a4"}
Jan 29 03:47:52 crc kubenswrapper[4707]: I0129 03:47:52.185145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5bzlc" event={"ID":"e9e0c927-bc87-4eb0-b565-e30b4278331c","Type":"ContainerStarted","Data":"86f6e194922acfc714150c39a7ebbecba8151b690f5517f5208282a9daf9d67c"}
Jan 29 03:47:52 crc kubenswrapper[4707]: I0129 03:47:52.202605 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5bzlc" podStartSLOduration=2.251175679 podStartE2EDuration="13.202587036s" podCreationTimestamp="2026-01-29 03:47:39 +0000 UTC" firstStartedPulling="2026-01-29 03:47:40.495985268 +0000 UTC m=+1213.980214173" lastFinishedPulling="2026-01-29 03:47:51.447396625 +0000 UTC m=+1224.931625530" observedRunningTime="2026-01-29 03:47:52.199397785 +0000 UTC m=+1225.683626680" watchObservedRunningTime="2026-01-29 03:47:52.202587036 +0000 UTC m=+1225.686815961"
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.059728 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.060316 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerName="glance-log" containerID="cri-o://55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d" gracePeriod=30
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.060835 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerName="glance-httpd" containerID="cri-o://ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e" gracePeriod=30
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.237939 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerID="3a427035625ca6abc64fbf3edddc23d3a22b87d86f4c534b56a3959f1deab348" exitCode=0
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.237978 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerID="f3fd8fe74ecadf1292ad95b2188474fbc69493ef9f9c3c4403546132bcecbe15" exitCode=0
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.239269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerDied","Data":"3a427035625ca6abc64fbf3edddc23d3a22b87d86f4c534b56a3959f1deab348"}
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.239345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerDied","Data":"f3fd8fe74ecadf1292ad95b2188474fbc69493ef9f9c3c4403546132bcecbe15"}
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.473559 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.561249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-scripts\") pod \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") "
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.561345 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-config-data\") pod \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") "
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.561528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-sg-core-conf-yaml\") pod \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") "
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.561580 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-run-httpd\") pod \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") "
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.561628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-combined-ca-bundle\") pod \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") "
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.561758 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6xhm\" (UniqueName: \"kubernetes.io/projected/0ffdcc22-dfec-478d-a1b7-217b174ee80e-kube-api-access-q6xhm\") pod \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") "
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.561836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-log-httpd\") pod \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\" (UID: \"0ffdcc22-dfec-478d-a1b7-217b174ee80e\") "
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.562597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0ffdcc22-dfec-478d-a1b7-217b174ee80e" (UID: "0ffdcc22-dfec-478d-a1b7-217b174ee80e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.568997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0ffdcc22-dfec-478d-a1b7-217b174ee80e" (UID: "0ffdcc22-dfec-478d-a1b7-217b174ee80e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.583840 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffdcc22-dfec-478d-a1b7-217b174ee80e-kube-api-access-q6xhm" (OuterVolumeSpecName: "kube-api-access-q6xhm") pod "0ffdcc22-dfec-478d-a1b7-217b174ee80e" (UID: "0ffdcc22-dfec-478d-a1b7-217b174ee80e"). InnerVolumeSpecName "kube-api-access-q6xhm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.591510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-scripts" (OuterVolumeSpecName: "scripts") pod "0ffdcc22-dfec-478d-a1b7-217b174ee80e" (UID: "0ffdcc22-dfec-478d-a1b7-217b174ee80e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.631679 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0ffdcc22-dfec-478d-a1b7-217b174ee80e" (UID: "0ffdcc22-dfec-478d-a1b7-217b174ee80e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.666038 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.666079 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.666089 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6xhm\" (UniqueName: \"kubernetes.io/projected/0ffdcc22-dfec-478d-a1b7-217b174ee80e-kube-api-access-q6xhm\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.666101 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ffdcc22-dfec-478d-a1b7-217b174ee80e-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.666110 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.806811 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ffdcc22-dfec-478d-a1b7-217b174ee80e" (UID: "0ffdcc22-dfec-478d-a1b7-217b174ee80e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.829747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-config-data" (OuterVolumeSpecName: "config-data") pod "0ffdcc22-dfec-478d-a1b7-217b174ee80e" (UID: "0ffdcc22-dfec-478d-a1b7-217b174ee80e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.874391 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:53 crc kubenswrapper[4707]: I0129 03:47:53.874635 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ffdcc22-dfec-478d-a1b7-217b174ee80e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.248671 4707 generic.go:334] "Generic (PLEG): container finished" podID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerID="55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d" exitCode=143
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.248758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"434a08d3-ec01-45a9-9b61-ceb740c82fa0","Type":"ContainerDied","Data":"55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d"}
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.253420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ffdcc22-dfec-478d-a1b7-217b174ee80e","Type":"ContainerDied","Data":"9416567b0e29ca971b4076c89b7bccc583772008ef0c71f3201a596d2d128280"}
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.253480 4707 scope.go:117] "RemoveContainer" containerID="eae999602fdb4272562c1b24fb7e3468e84238e6bb363729ced094440c6257d9"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.253639 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.298172 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.299790 4707 scope.go:117] "RemoveContainer" containerID="a88106e0f751c9ae1b497337d8216d68dc1557b55222edd615918a3faf6077a4"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.305866 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.332410 4707 scope.go:117] "RemoveContainer" containerID="3a427035625ca6abc64fbf3edddc23d3a22b87d86f4c534b56a3959f1deab348"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.370911 4707 scope.go:117] "RemoveContainer" containerID="f3fd8fe74ecadf1292ad95b2188474fbc69493ef9f9c3c4403546132bcecbe15"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.386439 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.387427 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0644f450-79f9-4f20-9476-e31d4b673507" containerName="heat-api"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.387469 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0644f450-79f9-4f20-9476-e31d4b673507" containerName="heat-api"
Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.387505 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="ceilometer-notification-agent"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.387513 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="ceilometer-notification-agent"
Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.387522 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89" containerName="heat-cfnapi"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.387560 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89" containerName="heat-cfnapi"
Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.387579 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="sg-core"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.387585 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="sg-core"
Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.387594 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396ac802-02ea-480d-bd17-d18d64e8958f" containerName="placement-log"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.387600 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="396ac802-02ea-480d-bd17-d18d64e8958f" containerName="placement-log"
Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.387608 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396ac802-02ea-480d-bd17-d18d64e8958f" containerName="placement-api"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.387638 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="396ac802-02ea-480d-bd17-d18d64e8958f" containerName="placement-api"
Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.387652 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="ceilometer-central-agent"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.387658 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="ceilometer-central-agent"
Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.387675 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="proxy-httpd"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.387682 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="proxy-httpd"
Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.387717 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89" containerName="heat-cfnapi"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.387724 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89" containerName="heat-cfnapi"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388146 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="ceilometer-notification-agent"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388171 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="sg-core"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388183 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0644f450-79f9-4f20-9476-e31d4b673507" containerName="heat-api"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388195 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="396ac802-02ea-480d-bd17-d18d64e8958f" containerName="placement-log"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388212 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89" containerName="heat-cfnapi"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388225 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cae86f-c7b4-4298-b9c5-6925a215df89" containerName="heat-cfnapi"
Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388239 4707 memory_manager.go:354] "RemoveStaleState removing state"
podUID="0644f450-79f9-4f20-9476-e31d4b673507" containerName="heat-api" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388257 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="396ac802-02ea-480d-bd17-d18d64e8958f" containerName="placement-api" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388276 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="ceilometer-central-agent" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388293 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" containerName="proxy-httpd" Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.388731 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0644f450-79f9-4f20-9476-e31d4b673507" containerName="heat-api" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.388746 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0644f450-79f9-4f20-9476-e31d4b673507" containerName="heat-api" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.392113 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.396126 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.396585 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.434957 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.438143 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.439967 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.441318 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 29 03:47:54 crc kubenswrapper[4707]: E0129 03:47:54.441452 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/heat-engine-79896f576-thgpc" podUID="35911d95-d558-41c9-9c2b-811b60410a49" containerName="heat-engine" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.497434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.497502 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.497563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-config-data\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.497997 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxfpb\" (UniqueName: \"kubernetes.io/projected/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-kube-api-access-vxfpb\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.498096 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-log-httpd\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " 
pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.498317 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-scripts\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.498473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-run-httpd\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.600290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.600359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.600399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-config-data\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.600471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxfpb\" 
(UniqueName: \"kubernetes.io/projected/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-kube-api-access-vxfpb\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.600502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-log-httpd\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.600532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-scripts\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.600577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-run-httpd\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.605006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-run-httpd\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.606805 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-log-httpd\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.609783 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.610376 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-scripts\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.617586 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.622567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-config-data\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.628067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxfpb\" (UniqueName: \"kubernetes.io/projected/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-kube-api-access-vxfpb\") pod \"ceilometer-0\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") " pod="openstack/ceilometer-0" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.640093 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": dial tcp 
10.217.0.153:9292: connect: connection refused" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.640924 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": dial tcp 10.217.0.153:9292: connect: connection refused" Jan 29 03:47:54 crc kubenswrapper[4707]: I0129 03:47:54.719360 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.196383 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.273921 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffdcc22-dfec-478d-a1b7-217b174ee80e" path="/var/lib/kubelet/pods/0ffdcc22-dfec-478d-a1b7-217b174ee80e/volumes" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.283878 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0871cd7-6629-480c-801d-73c00a747882" containerID="ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48" exitCode=0 Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.283993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0871cd7-6629-480c-801d-73c00a747882","Type":"ContainerDied","Data":"ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48"} Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.284032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0871cd7-6629-480c-801d-73c00a747882","Type":"ContainerDied","Data":"246e10e7e0bacde856b7172c25d7915349b3c096ec8442acf8ed9f5af8ab4c5a"} Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.284077 4707 scope.go:117] 
"RemoveContainer" containerID="ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.284220 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.288716 4707 generic.go:334] "Generic (PLEG): container finished" podID="35911d95-d558-41c9-9c2b-811b60410a49" containerID="18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc" exitCode=0 Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.288790 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79896f576-thgpc" event={"ID":"35911d95-d558-41c9-9c2b-811b60410a49","Type":"ContainerDied","Data":"18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc"} Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.323182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59mtm\" (UniqueName: \"kubernetes.io/projected/f0871cd7-6629-480c-801d-73c00a747882-kube-api-access-59mtm\") pod \"f0871cd7-6629-480c-801d-73c00a747882\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.323616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-config-data\") pod \"f0871cd7-6629-480c-801d-73c00a747882\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.323639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-public-tls-certs\") pod \"f0871cd7-6629-480c-801d-73c00a747882\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.323689 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-httpd-run\") pod \"f0871cd7-6629-480c-801d-73c00a747882\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.323779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-combined-ca-bundle\") pod \"f0871cd7-6629-480c-801d-73c00a747882\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.323817 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-scripts\") pod \"f0871cd7-6629-480c-801d-73c00a747882\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.323850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-logs\") pod \"f0871cd7-6629-480c-801d-73c00a747882\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.323960 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"f0871cd7-6629-480c-801d-73c00a747882\" (UID: \"f0871cd7-6629-480c-801d-73c00a747882\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.327507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f0871cd7-6629-480c-801d-73c00a747882" (UID: "f0871cd7-6629-480c-801d-73c00a747882"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.327850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-logs" (OuterVolumeSpecName: "logs") pod "f0871cd7-6629-480c-801d-73c00a747882" (UID: "f0871cd7-6629-480c-801d-73c00a747882"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.333740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0871cd7-6629-480c-801d-73c00a747882-kube-api-access-59mtm" (OuterVolumeSpecName: "kube-api-access-59mtm") pod "f0871cd7-6629-480c-801d-73c00a747882" (UID: "f0871cd7-6629-480c-801d-73c00a747882"). InnerVolumeSpecName "kube-api-access-59mtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.333907 4707 scope.go:117] "RemoveContainer" containerID="9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.334044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "f0871cd7-6629-480c-801d-73c00a747882" (UID: "f0871cd7-6629-480c-801d-73c00a747882"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.336613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-scripts" (OuterVolumeSpecName: "scripts") pod "f0871cd7-6629-480c-801d-73c00a747882" (UID: "f0871cd7-6629-480c-801d-73c00a747882"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.400724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f0871cd7-6629-480c-801d-73c00a747882" (UID: "f0871cd7-6629-480c-801d-73c00a747882"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.409409 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.421789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0871cd7-6629-480c-801d-73c00a747882" (UID: "f0871cd7-6629-480c-801d-73c00a747882"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.423335 4707 scope.go:117] "RemoveContainer" containerID="ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48" Jan 29 03:47:55 crc kubenswrapper[4707]: E0129 03:47:55.430846 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48\": container with ID starting with ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48 not found: ID does not exist" containerID="ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.430894 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48"} err="failed to get container status \"ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48\": rpc error: code = NotFound desc = could not find container \"ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48\": container with ID starting with ff12186335ea99a8654c3e77ffc77f7f1bbf7214481368e5481ea90bfd021f48 not found: ID does not exist" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.430925 4707 scope.go:117] "RemoveContainer" containerID="9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b" Jan 29 03:47:55 crc kubenswrapper[4707]: E0129 03:47:55.432792 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b\": container with ID starting with 9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b not found: ID does not exist" containerID="9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.432830 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b"} err="failed to get container status \"9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b\": rpc error: code = NotFound desc = could not find container \"9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b\": container with ID starting with 9537a1a8f21dd69ca6ee570e24d44f955bd8eeeca41c533b2f43931adafe998b not found: ID does not exist" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.434427 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.434453 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59mtm\" (UniqueName: \"kubernetes.io/projected/f0871cd7-6629-480c-801d-73c00a747882-kube-api-access-59mtm\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.434465 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.434474 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.434483 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.434492 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.434500 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0871cd7-6629-480c-801d-73c00a747882-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.457465 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-config-data" (OuterVolumeSpecName: "config-data") pod "f0871cd7-6629-480c-801d-73c00a747882" (UID: "f0871cd7-6629-480c-801d-73c00a747882"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.476056 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.493434 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.536497 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.536558 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0871cd7-6629-480c-801d-73c00a747882-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.637682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data-custom\") pod \"35911d95-d558-41c9-9c2b-811b60410a49\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.637931 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s27xr\" (UniqueName: \"kubernetes.io/projected/35911d95-d558-41c9-9c2b-811b60410a49-kube-api-access-s27xr\") pod \"35911d95-d558-41c9-9c2b-811b60410a49\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.638012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data\") pod \"35911d95-d558-41c9-9c2b-811b60410a49\" (UID: \"35911d95-d558-41c9-9c2b-811b60410a49\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.638113 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-combined-ca-bundle\") pod \"35911d95-d558-41c9-9c2b-811b60410a49\" (UID: 
\"35911d95-d558-41c9-9c2b-811b60410a49\") " Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.643066 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "35911d95-d558-41c9-9c2b-811b60410a49" (UID: "35911d95-d558-41c9-9c2b-811b60410a49"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.644587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35911d95-d558-41c9-9c2b-811b60410a49-kube-api-access-s27xr" (OuterVolumeSpecName: "kube-api-access-s27xr") pod "35911d95-d558-41c9-9c2b-811b60410a49" (UID: "35911d95-d558-41c9-9c2b-811b60410a49"). InnerVolumeSpecName "kube-api-access-s27xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.668688 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35911d95-d558-41c9-9c2b-811b60410a49" (UID: "35911d95-d558-41c9-9c2b-811b60410a49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.669059 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.698824 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.706584 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 03:47:55 crc kubenswrapper[4707]: E0129 03:47:55.707196 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35911d95-d558-41c9-9c2b-811b60410a49" containerName="heat-engine" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.707217 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35911d95-d558-41c9-9c2b-811b60410a49" containerName="heat-engine" Jan 29 03:47:55 crc kubenswrapper[4707]: E0129 03:47:55.707240 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-log" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.707249 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-log" Jan 29 03:47:55 crc kubenswrapper[4707]: E0129 03:47:55.707270 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-httpd" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.707278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-httpd" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.707507 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-httpd" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.707530 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="35911d95-d558-41c9-9c2b-811b60410a49" containerName="heat-engine" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.707566 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0871cd7-6629-480c-801d-73c00a747882" containerName="glance-log" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.708848 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.711926 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.720123 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.721040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data" (OuterVolumeSpecName: "config-data") pod "35911d95-d558-41c9-9c2b-811b60410a49" (UID: "35911d95-d558-41c9-9c2b-811b60410a49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.722619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.741837 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s27xr\" (UniqueName: \"kubernetes.io/projected/35911d95-d558-41c9-9c2b-811b60410a49-kube-api-access-s27xr\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.741872 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.741884 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.741893 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35911d95-d558-41c9-9c2b-811b60410a49-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.843385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.843476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.843516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvs5j\" (UniqueName: \"kubernetes.io/projected/46ce0794-979b-4f4c-9a41-b895bbc25d0c-kube-api-access-gvs5j\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.843558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46ce0794-979b-4f4c-9a41-b895bbc25d0c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.843585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.843611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-config-data\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.843641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.843683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46ce0794-979b-4f4c-9a41-b895bbc25d0c-logs\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.946318 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.946518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-scripts\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.947597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvs5j\" (UniqueName: \"kubernetes.io/projected/46ce0794-979b-4f4c-9a41-b895bbc25d0c-kube-api-access-gvs5j\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.947634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46ce0794-979b-4f4c-9a41-b895bbc25d0c-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.947118 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.947665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.947942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-config-data\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.947993 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.948128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46ce0794-979b-4f4c-9a41-b895bbc25d0c-logs\") pod \"glance-default-external-api-0\" (UID: 
\"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.948600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46ce0794-979b-4f4c-9a41-b895bbc25d0c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.949002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46ce0794-979b-4f4c-9a41-b895bbc25d0c-logs\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.952082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-scripts\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.952961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.953348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc 
kubenswrapper[4707]: I0129 03:47:55.954283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46ce0794-979b-4f4c-9a41-b895bbc25d0c-config-data\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.967858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvs5j\" (UniqueName: \"kubernetes.io/projected/46ce0794-979b-4f4c-9a41-b895bbc25d0c-kube-api-access-gvs5j\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:55 crc kubenswrapper[4707]: I0129 03:47:55.992053 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"46ce0794-979b-4f4c-9a41-b895bbc25d0c\") " pod="openstack/glance-default-external-api-0" Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.105475 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.276174 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:32982->10.217.0.154:9292: read: connection reset by peer" Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.276394 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:32992->10.217.0.154:9292: read: connection reset by peer" Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.317229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerStarted","Data":"c1e5e92878eb2a2340e520bc71d2aaa48a2dd840d2225d9bd2bc6131010e15df"} Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.317319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerStarted","Data":"bf91bb8f8bf8fd2a6c43b1a61c32575ccd2a4b87bfeaacb65d8fc50619bcb5ca"} Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.323281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79896f576-thgpc" event={"ID":"35911d95-d558-41c9-9c2b-811b60410a49","Type":"ContainerDied","Data":"82bf2c358008dfeee50ff6eddf66028970320bfb989ec05987614d805393db10"} Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.323326 4707 scope.go:117] "RemoveContainer" containerID="18aa58b674ec3ea350336b365055a3ab9e29644363c98c0c4261d2f4fb2130bc" Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 
03:47:56.323376 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79896f576-thgpc" Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.431077 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-79896f576-thgpc"] Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.457873 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-79896f576-thgpc"] Jan 29 03:47:56 crc kubenswrapper[4707]: I0129 03:47:56.781405 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 03:47:56 crc kubenswrapper[4707]: W0129 03:47:56.802228 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46ce0794_979b_4f4c_9a41_b895bbc25d0c.slice/crio-11063838f5cf697a1850d8c810fd2f612e95ed48572794c769440ea0820ac5a6 WatchSource:0}: Error finding container 11063838f5cf697a1850d8c810fd2f612e95ed48572794c769440ea0820ac5a6: Status 404 returned error can't find the container with id 11063838f5cf697a1850d8c810fd2f612e95ed48572794c769440ea0820ac5a6 Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.024892 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.195892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-httpd-run\") pod \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.195988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.196080 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-scripts\") pod \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.196102 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr5td\" (UniqueName: \"kubernetes.io/projected/434a08d3-ec01-45a9-9b61-ceb740c82fa0-kube-api-access-tr5td\") pod \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.196147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-logs\") pod \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.196192 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-internal-tls-certs\") pod \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.196221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-config-data\") pod \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.196295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-combined-ca-bundle\") pod \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\" (UID: \"434a08d3-ec01-45a9-9b61-ceb740c82fa0\") " Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.196824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "434a08d3-ec01-45a9-9b61-ceb740c82fa0" (UID: "434a08d3-ec01-45a9-9b61-ceb740c82fa0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.209489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-logs" (OuterVolumeSpecName: "logs") pod "434a08d3-ec01-45a9-9b61-ceb740c82fa0" (UID: "434a08d3-ec01-45a9-9b61-ceb740c82fa0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.213808 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-scripts" (OuterVolumeSpecName: "scripts") pod "434a08d3-ec01-45a9-9b61-ceb740c82fa0" (UID: "434a08d3-ec01-45a9-9b61-ceb740c82fa0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.241364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434a08d3-ec01-45a9-9b61-ceb740c82fa0-kube-api-access-tr5td" (OuterVolumeSpecName: "kube-api-access-tr5td") pod "434a08d3-ec01-45a9-9b61-ceb740c82fa0" (UID: "434a08d3-ec01-45a9-9b61-ceb740c82fa0"). InnerVolumeSpecName "kube-api-access-tr5td". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.241377 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "434a08d3-ec01-45a9-9b61-ceb740c82fa0" (UID: "434a08d3-ec01-45a9-9b61-ceb740c82fa0"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.300270 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.300689 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.300701 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.300710 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr5td\" (UniqueName: \"kubernetes.io/projected/434a08d3-ec01-45a9-9b61-ceb740c82fa0-kube-api-access-tr5td\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.300721 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/434a08d3-ec01-45a9-9b61-ceb740c82fa0-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.306839 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "434a08d3-ec01-45a9-9b61-ceb740c82fa0" (UID: "434a08d3-ec01-45a9-9b61-ceb740c82fa0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.319916 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35911d95-d558-41c9-9c2b-811b60410a49" path="/var/lib/kubelet/pods/35911d95-d558-41c9-9c2b-811b60410a49/volumes" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.320758 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0871cd7-6629-480c-801d-73c00a747882" path="/var/lib/kubelet/pods/f0871cd7-6629-480c-801d-73c00a747882/volumes" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.343750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "434a08d3-ec01-45a9-9b61-ceb740c82fa0" (UID: "434a08d3-ec01-45a9-9b61-ceb740c82fa0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.366950 4707 generic.go:334] "Generic (PLEG): container finished" podID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerID="ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e" exitCode=0 Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.367025 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.367099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"434a08d3-ec01-45a9-9b61-ceb740c82fa0","Type":"ContainerDied","Data":"ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e"} Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.367150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"434a08d3-ec01-45a9-9b61-ceb740c82fa0","Type":"ContainerDied","Data":"98641fc038f4d11ecd6840619726c5c723108d99ef8c4b1b7b896f1caf350b50"} Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.367180 4707 scope.go:117] "RemoveContainer" containerID="ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.376256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"46ce0794-979b-4f4c-9a41-b895bbc25d0c","Type":"ContainerStarted","Data":"11063838f5cf697a1850d8c810fd2f612e95ed48572794c769440ea0820ac5a6"} Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.382268 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-config-data" (OuterVolumeSpecName: "config-data") pod "434a08d3-ec01-45a9-9b61-ceb740c82fa0" (UID: "434a08d3-ec01-45a9-9b61-ceb740c82fa0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.384784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerStarted","Data":"87a2bc946cf7fdbe88c88c52243aaab67f855b15c45fbf77b64004209ab92506"} Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.404862 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.404901 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.404912 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/434a08d3-ec01-45a9-9b61-ceb740c82fa0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.414519 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.428728 4707 scope.go:117] "RemoveContainer" containerID="55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.499651 4707 scope.go:117] "RemoveContainer" containerID="ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e" Jan 29 03:47:57 crc kubenswrapper[4707]: E0129 03:47:57.500418 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e\": container with ID starting with ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e not found: ID does not exist" containerID="ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.500457 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e"} err="failed to get container status \"ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e\": rpc error: code = NotFound desc = could not find container \"ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e\": container with ID starting with ec4219920f08ee6ebcdb872c2460adaaa8eb8688bb20b5fd16e185061f1dc14e not found: ID does not exist" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.500481 4707 scope.go:117] "RemoveContainer" containerID="55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d" Jan 29 03:47:57 crc kubenswrapper[4707]: E0129 03:47:57.502587 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d\": container with ID starting with 55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d not found: ID does not exist" containerID="55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.502619 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d"} err="failed to get container status \"55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d\": rpc error: code = NotFound desc = could not find container \"55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d\": container with ID 
starting with 55161a1d74ea23be1931830f5d37bcc4dc4991906f39db5f6fc0b5db528cd63d not found: ID does not exist" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.507009 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.739784 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.768111 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.789017 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 03:47:57 crc kubenswrapper[4707]: E0129 03:47:57.789577 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerName="glance-httpd" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.789596 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerName="glance-httpd" Jan 29 03:47:57 crc kubenswrapper[4707]: E0129 03:47:57.789636 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerName="glance-log" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.789643 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerName="glance-log" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.790068 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" containerName="glance-log" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.790080 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" 
containerName="glance-httpd" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.791115 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.801848 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.801991 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.825914 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.919379 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.919651 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.919713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vphn\" (UniqueName: \"kubernetes.io/projected/9a1690ce-5c45-4a23-abd5-a1521acd3f82-kube-api-access-4vphn\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 
03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.919751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a1690ce-5c45-4a23-abd5-a1521acd3f82-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.919786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a1690ce-5c45-4a23-abd5-a1521acd3f82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.919878 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.919909 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:57 crc kubenswrapper[4707]: I0129 03:47:57.920038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc 
kubenswrapper[4707]: I0129 03:47:58.021686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.021735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.021763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.021814 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.021866 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.021897 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4vphn\" (UniqueName: \"kubernetes.io/projected/9a1690ce-5c45-4a23-abd5-a1521acd3f82-kube-api-access-4vphn\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.021919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a1690ce-5c45-4a23-abd5-a1521acd3f82-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.021940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a1690ce-5c45-4a23-abd5-a1521acd3f82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.023187 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a1690ce-5c45-4a23-abd5-a1521acd3f82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.023433 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a1690ce-5c45-4a23-abd5-a1521acd3f82-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.024583 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.031677 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.034274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.040423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.041280 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a1690ce-5c45-4a23-abd5-a1521acd3f82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.054168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vphn\" (UniqueName: 
\"kubernetes.io/projected/9a1690ce-5c45-4a23-abd5-a1521acd3f82-kube-api-access-4vphn\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.082777 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a1690ce-5c45-4a23-abd5-a1521acd3f82\") " pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.144285 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.488388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"46ce0794-979b-4f4c-9a41-b895bbc25d0c","Type":"ContainerStarted","Data":"52d9447a7c7badd2d3571c5b7794f2fefc41811747cc16560861367669beb2ce"} Jan 29 03:47:58 crc kubenswrapper[4707]: I0129 03:47:58.515876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerStarted","Data":"4739e659e3af3d1e16baa927871e6bea8ac09784cc4f793ab3b4153db277b4eb"} Jan 29 03:47:59 crc kubenswrapper[4707]: I0129 03:47:59.128959 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 03:47:59 crc kubenswrapper[4707]: W0129 03:47:59.145050 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a1690ce_5c45_4a23_abd5_a1521acd3f82.slice/crio-9aeb0eb82cff24e06c312497ae8ece2a9c753cceccc3ef17b0f66f8f81a48dea WatchSource:0}: Error finding container 9aeb0eb82cff24e06c312497ae8ece2a9c753cceccc3ef17b0f66f8f81a48dea: Status 404 returned error can't find 
the container with id 9aeb0eb82cff24e06c312497ae8ece2a9c753cceccc3ef17b0f66f8f81a48dea Jan 29 03:47:59 crc kubenswrapper[4707]: I0129 03:47:59.293119 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="434a08d3-ec01-45a9-9b61-ceb740c82fa0" path="/var/lib/kubelet/pods/434a08d3-ec01-45a9-9b61-ceb740c82fa0/volumes" Jan 29 03:47:59 crc kubenswrapper[4707]: I0129 03:47:59.535577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"46ce0794-979b-4f4c-9a41-b895bbc25d0c","Type":"ContainerStarted","Data":"1c912fc1b6e71b912849dfc6898f64b1b2fc67113b173ecae4a6a0bd7b81cbc2"} Jan 29 03:47:59 crc kubenswrapper[4707]: I0129 03:47:59.545127 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 03:47:59 crc kubenswrapper[4707]: I0129 03:47:59.549612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a1690ce-5c45-4a23-abd5-a1521acd3f82","Type":"ContainerStarted","Data":"9aeb0eb82cff24e06c312497ae8ece2a9c753cceccc3ef17b0f66f8f81a48dea"} Jan 29 03:47:59 crc kubenswrapper[4707]: I0129 03:47:59.556806 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.556783028 podStartE2EDuration="4.556783028s" podCreationTimestamp="2026-01-29 03:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:47:59.556628594 +0000 UTC m=+1233.040857539" watchObservedRunningTime="2026-01-29 03:47:59.556783028 +0000 UTC m=+1233.041011933" Jan 29 03:47:59 crc kubenswrapper[4707]: I0129 03:47:59.592434 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7194756199999999 podStartE2EDuration="5.592410746s" podCreationTimestamp="2026-01-29 03:47:54 +0000 UTC" 
firstStartedPulling="2026-01-29 03:47:55.43388867 +0000 UTC m=+1228.918117575" lastFinishedPulling="2026-01-29 03:47:59.306823796 +0000 UTC m=+1232.791052701" observedRunningTime="2026-01-29 03:47:59.589368429 +0000 UTC m=+1233.073597344" watchObservedRunningTime="2026-01-29 03:47:59.592410746 +0000 UTC m=+1233.076639651" Jan 29 03:48:00 crc kubenswrapper[4707]: I0129 03:48:00.562749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerStarted","Data":"d632e85024544d232a053ac7743aa63432b275a358b7487b042c9b5d2636eec4"} Jan 29 03:48:00 crc kubenswrapper[4707]: I0129 03:48:00.565612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a1690ce-5c45-4a23-abd5-a1521acd3f82","Type":"ContainerStarted","Data":"f375086be859cef66b986ce9029906cae636b82f0f693afac077aa3d80fb4f74"} Jan 29 03:48:00 crc kubenswrapper[4707]: I0129 03:48:00.565684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a1690ce-5c45-4a23-abd5-a1521acd3f82","Type":"ContainerStarted","Data":"382208d7874e141616e176b5987be4403806f450cf26d2d537e6475c17b4b05e"} Jan 29 03:48:00 crc kubenswrapper[4707]: I0129 03:48:00.595195 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.595166376 podStartE2EDuration="3.595166376s" podCreationTimestamp="2026-01-29 03:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:48:00.588265819 +0000 UTC m=+1234.072494724" watchObservedRunningTime="2026-01-29 03:48:00.595166376 +0000 UTC m=+1234.079395281" Jan 29 03:48:06 crc kubenswrapper[4707]: I0129 03:48:06.105692 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Jan 29 03:48:06 crc kubenswrapper[4707]: I0129 03:48:06.106798 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 03:48:06 crc kubenswrapper[4707]: I0129 03:48:06.144644 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 03:48:06 crc kubenswrapper[4707]: I0129 03:48:06.178299 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 03:48:06 crc kubenswrapper[4707]: I0129 03:48:06.632780 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9e0c927-bc87-4eb0-b565-e30b4278331c" containerID="86f6e194922acfc714150c39a7ebbecba8151b690f5517f5208282a9daf9d67c" exitCode=0 Jan 29 03:48:06 crc kubenswrapper[4707]: I0129 03:48:06.635022 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5bzlc" event={"ID":"e9e0c927-bc87-4eb0-b565-e30b4278331c","Type":"ContainerDied","Data":"86f6e194922acfc714150c39a7ebbecba8151b690f5517f5208282a9daf9d67c"} Jan 29 03:48:06 crc kubenswrapper[4707]: I0129 03:48:06.635183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 03:48:06 crc kubenswrapper[4707]: I0129 03:48:06.635208 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.058457 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.127232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-combined-ca-bundle\") pod \"e9e0c927-bc87-4eb0-b565-e30b4278331c\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.127447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-scripts\") pod \"e9e0c927-bc87-4eb0-b565-e30b4278331c\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.127589 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7k62\" (UniqueName: \"kubernetes.io/projected/e9e0c927-bc87-4eb0-b565-e30b4278331c-kube-api-access-v7k62\") pod \"e9e0c927-bc87-4eb0-b565-e30b4278331c\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.127659 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-config-data\") pod \"e9e0c927-bc87-4eb0-b565-e30b4278331c\" (UID: \"e9e0c927-bc87-4eb0-b565-e30b4278331c\") " Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.134494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-scripts" (OuterVolumeSpecName: "scripts") pod "e9e0c927-bc87-4eb0-b565-e30b4278331c" (UID: "e9e0c927-bc87-4eb0-b565-e30b4278331c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.144933 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.145201 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.152853 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e0c927-bc87-4eb0-b565-e30b4278331c-kube-api-access-v7k62" (OuterVolumeSpecName: "kube-api-access-v7k62") pod "e9e0c927-bc87-4eb0-b565-e30b4278331c" (UID: "e9e0c927-bc87-4eb0-b565-e30b4278331c"). InnerVolumeSpecName "kube-api-access-v7k62". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.164640 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-config-data" (OuterVolumeSpecName: "config-data") pod "e9e0c927-bc87-4eb0-b565-e30b4278331c" (UID: "e9e0c927-bc87-4eb0-b565-e30b4278331c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.174897 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9e0c927-bc87-4eb0-b565-e30b4278331c" (UID: "e9e0c927-bc87-4eb0-b565-e30b4278331c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.179820 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.204426 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.230132 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7k62\" (UniqueName: \"kubernetes.io/projected/e9e0c927-bc87-4eb0-b565-e30b4278331c-kube-api-access-v7k62\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.230164 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.230177 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.230186 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9e0c927-bc87-4eb0-b565-e30b4278331c-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.655981 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.656668 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.656425 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5bzlc" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.656360 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5bzlc" event={"ID":"e9e0c927-bc87-4eb0-b565-e30b4278331c","Type":"ContainerDied","Data":"0ef1b96e5d7aade041802c4cf84c1b0b57dcd49a891bda87e421f5a52c69668a"} Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.658084 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ef1b96e5d7aade041802c4cf84c1b0b57dcd49a891bda87e421f5a52c69668a" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.658109 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.658123 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.723283 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.867184 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.897763 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 03:48:08 crc kubenswrapper[4707]: E0129 03:48:08.898179 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e0c927-bc87-4eb0-b565-e30b4278331c" containerName="nova-cell0-conductor-db-sync" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.898197 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e0c927-bc87-4eb0-b565-e30b4278331c" containerName="nova-cell0-conductor-db-sync" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.898402 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e0c927-bc87-4eb0-b565-e30b4278331c" containerName="nova-cell0-conductor-db-sync" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.899044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.911034 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.936559 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 03:48:08 crc kubenswrapper[4707]: I0129 03:48:08.943052 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pkw2v" Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.057894 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.057980 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.058064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42z5w\" (UniqueName: \"kubernetes.io/projected/a3b94873-970a-4564-a6b4-1e07adee8365-kube-api-access-42z5w\") pod \"nova-cell0-conductor-0\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") " 
pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.159984 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.160365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.160463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42z5w\" (UniqueName: \"kubernetes.io/projected/a3b94873-970a-4564-a6b4-1e07adee8365-kube-api-access-42z5w\") pod \"nova-cell0-conductor-0\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.172592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.182531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42z5w\" (UniqueName: \"kubernetes.io/projected/a3b94873-970a-4564-a6b4-1e07adee8365-kube-api-access-42z5w\") pod \"nova-cell0-conductor-0\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.183675 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.222209 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:09 crc kubenswrapper[4707]: I0129 03:48:09.758636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 03:48:09 crc kubenswrapper[4707]: W0129 03:48:09.766045 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3b94873_970a_4564_a6b4_1e07adee8365.slice/crio-5fb48353316bbd18524b28015230699493dbdb5f1cb3502d8c1a32849ea523b9 WatchSource:0}: Error finding container 5fb48353316bbd18524b28015230699493dbdb5f1cb3502d8c1a32849ea523b9: Status 404 returned error can't find the container with id 5fb48353316bbd18524b28015230699493dbdb5f1cb3502d8c1a32849ea523b9
Jan 29 03:48:10 crc kubenswrapper[4707]: I0129 03:48:10.676283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3b94873-970a-4564-a6b4-1e07adee8365","Type":"ContainerStarted","Data":"4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f"}
Jan 29 03:48:10 crc kubenswrapper[4707]: I0129 03:48:10.676711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3b94873-970a-4564-a6b4-1e07adee8365","Type":"ContainerStarted","Data":"5fb48353316bbd18524b28015230699493dbdb5f1cb3502d8c1a32849ea523b9"}
Jan 29 03:48:10 crc kubenswrapper[4707]: I0129 03:48:10.676715 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 29 03:48:10 crc kubenswrapper[4707]: I0129 03:48:10.676752 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 29 03:48:10 crc kubenswrapper[4707]: I0129 03:48:10.677375 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:10 crc kubenswrapper[4707]: I0129 03:48:10.703383 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.703361551 podStartE2EDuration="2.703361551s" podCreationTimestamp="2026-01-29 03:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:48:10.700006646 +0000 UTC m=+1244.184235551" watchObservedRunningTime="2026-01-29 03:48:10.703361551 +0000 UTC m=+1244.187590456"
Jan 29 03:48:10 crc kubenswrapper[4707]: I0129 03:48:10.717078 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.072733 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.080214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.517645 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.518562 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="sg-core" containerID="cri-o://4739e659e3af3d1e16baa927871e6bea8ac09784cc4f793ab3b4153db277b4eb" gracePeriod=30
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.518674 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="ceilometer-notification-agent" containerID="cri-o://87a2bc946cf7fdbe88c88c52243aaab67f855b15c45fbf77b64004209ab92506" gracePeriod=30
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.518742 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="proxy-httpd" containerID="cri-o://d632e85024544d232a053ac7743aa63432b275a358b7487b042c9b5d2636eec4" gracePeriod=30
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.518532 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="ceilometer-central-agent" containerID="cri-o://c1e5e92878eb2a2340e520bc71d2aaa48a2dd840d2225d9bd2bc6131010e15df" gracePeriod=30
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.534053 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.196:3000/\": EOF"
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.688637 4707 generic.go:334] "Generic (PLEG): container finished" podID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerID="4739e659e3af3d1e16baa927871e6bea8ac09784cc4f793ab3b4153db277b4eb" exitCode=2
Jan 29 03:48:11 crc kubenswrapper[4707]: I0129 03:48:11.688707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerDied","Data":"4739e659e3af3d1e16baa927871e6bea8ac09784cc4f793ab3b4153db277b4eb"}
Jan 29 03:48:12 crc kubenswrapper[4707]: I0129 03:48:12.703879 4707 generic.go:334] "Generic (PLEG): container finished" podID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerID="d632e85024544d232a053ac7743aa63432b275a358b7487b042c9b5d2636eec4" exitCode=0
Jan 29 03:48:12 crc kubenswrapper[4707]: I0129 03:48:12.705357 4707 generic.go:334] "Generic (PLEG): container finished" podID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerID="c1e5e92878eb2a2340e520bc71d2aaa48a2dd840d2225d9bd2bc6131010e15df" exitCode=0
Jan 29 03:48:12 crc kubenswrapper[4707]: I0129 03:48:12.703963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerDied","Data":"d632e85024544d232a053ac7743aa63432b275a358b7487b042c9b5d2636eec4"}
Jan 29 03:48:12 crc kubenswrapper[4707]: I0129 03:48:12.705557 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerDied","Data":"c1e5e92878eb2a2340e520bc71d2aaa48a2dd840d2225d9bd2bc6131010e15df"}
Jan 29 03:48:12 crc kubenswrapper[4707]: I0129 03:48:12.705704 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="a3b94873-970a-4564-a6b4-1e07adee8365" containerName="nova-cell0-conductor-conductor" containerID="cri-o://4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f" gracePeriod=30
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.634050 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.666669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-combined-ca-bundle\") pod \"a3b94873-970a-4564-a6b4-1e07adee8365\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") "
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.666928 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-config-data\") pod \"a3b94873-970a-4564-a6b4-1e07adee8365\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") "
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.667009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42z5w\" (UniqueName: \"kubernetes.io/projected/a3b94873-970a-4564-a6b4-1e07adee8365-kube-api-access-42z5w\") pod \"a3b94873-970a-4564-a6b4-1e07adee8365\" (UID: \"a3b94873-970a-4564-a6b4-1e07adee8365\") "
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.682723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b94873-970a-4564-a6b4-1e07adee8365-kube-api-access-42z5w" (OuterVolumeSpecName: "kube-api-access-42z5w") pod "a3b94873-970a-4564-a6b4-1e07adee8365" (UID: "a3b94873-970a-4564-a6b4-1e07adee8365"). InnerVolumeSpecName "kube-api-access-42z5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.722796 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-config-data" (OuterVolumeSpecName: "config-data") pod "a3b94873-970a-4564-a6b4-1e07adee8365" (UID: "a3b94873-970a-4564-a6b4-1e07adee8365"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.764996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3b94873-970a-4564-a6b4-1e07adee8365" (UID: "a3b94873-970a-4564-a6b4-1e07adee8365"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.768912 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.768935 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42z5w\" (UniqueName: \"kubernetes.io/projected/a3b94873-970a-4564-a6b4-1e07adee8365-kube-api-access-42z5w\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.768946 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b94873-970a-4564-a6b4-1e07adee8365-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.824058 4707 generic.go:334] "Generic (PLEG): container finished" podID="a3b94873-970a-4564-a6b4-1e07adee8365" containerID="4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f" exitCode=0
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.824149 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3b94873-970a-4564-a6b4-1e07adee8365","Type":"ContainerDied","Data":"4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f"}
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.824204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3b94873-970a-4564-a6b4-1e07adee8365","Type":"ContainerDied","Data":"5fb48353316bbd18524b28015230699493dbdb5f1cb3502d8c1a32849ea523b9"}
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.824229 4707 scope.go:117] "RemoveContainer" containerID="4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.824531 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.884605 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.895041 4707 scope.go:117] "RemoveContainer" containerID="4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f"
Jan 29 03:48:13 crc kubenswrapper[4707]: E0129 03:48:13.895788 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f\": container with ID starting with 4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f not found: ID does not exist" containerID="4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.895825 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f"} err="failed to get container status \"4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f\": rpc error: code = NotFound desc = could not find container \"4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f\": container with ID starting with 4209945c624750f2cade2fbfc5943d5a5702b6f36fe934c41e9e7cd3aec9177f not found: ID does not exist"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.898625 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.915516 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 03:48:13 crc kubenswrapper[4707]: E0129 03:48:13.916128 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b94873-970a-4564-a6b4-1e07adee8365" containerName="nova-cell0-conductor-conductor"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.916149 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b94873-970a-4564-a6b4-1e07adee8365" containerName="nova-cell0-conductor-conductor"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.916312 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b94873-970a-4564-a6b4-1e07adee8365" containerName="nova-cell0-conductor-conductor"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.917045 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.921997 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pkw2v"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.922225 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.928741 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.974985 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.975047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:13 crc kubenswrapper[4707]: I0129 03:48:13.975127 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkvqm\" (UniqueName: \"kubernetes.io/projected/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-kube-api-access-jkvqm\") pod \"nova-cell0-conductor-0\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.077587 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkvqm\" (UniqueName: \"kubernetes.io/projected/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-kube-api-access-jkvqm\") pod \"nova-cell0-conductor-0\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.077719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.077788 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.082303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.082435 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.098850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkvqm\" (UniqueName: \"kubernetes.io/projected/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-kube-api-access-jkvqm\") pod \"nova-cell0-conductor-0\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.291460 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:14 crc kubenswrapper[4707]: W0129 03:48:14.767263 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b783a8_a9cd_43fe_adfc_ef70d2eff3bd.slice/crio-10656f5f1e3731a39c02de41339dedb6439499941671903a9f37431eed541bc3 WatchSource:0}: Error finding container 10656f5f1e3731a39c02de41339dedb6439499941671903a9f37431eed541bc3: Status 404 returned error can't find the container with id 10656f5f1e3731a39c02de41339dedb6439499941671903a9f37431eed541bc3
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.768565 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.842227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd","Type":"ContainerStarted","Data":"10656f5f1e3731a39c02de41339dedb6439499941671903a9f37431eed541bc3"}
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.846143 4707 generic.go:334] "Generic (PLEG): container finished" podID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerID="87a2bc946cf7fdbe88c88c52243aaab67f855b15c45fbf77b64004209ab92506" exitCode=0
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.846193 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerDied","Data":"87a2bc946cf7fdbe88c88c52243aaab67f855b15c45fbf77b64004209ab92506"}
Jan 29 03:48:14 crc kubenswrapper[4707]: I0129 03:48:14.951877 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.001474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxfpb\" (UniqueName: \"kubernetes.io/projected/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-kube-api-access-vxfpb\") pod \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") "
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.001601 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-sg-core-conf-yaml\") pod \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") "
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.001699 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-log-httpd\") pod \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") "
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.001764 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-combined-ca-bundle\") pod \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") "
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.001819 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-scripts\") pod \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") "
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.001899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-config-data\") pod \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") "
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.001972 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-run-httpd\") pod \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\" (UID: \"e014a6b9-3ea1-4c50-a410-6cf81d5c6923\") "
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.002902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e014a6b9-3ea1-4c50-a410-6cf81d5c6923" (UID: "e014a6b9-3ea1-4c50-a410-6cf81d5c6923"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.008513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-kube-api-access-vxfpb" (OuterVolumeSpecName: "kube-api-access-vxfpb") pod "e014a6b9-3ea1-4c50-a410-6cf81d5c6923" (UID: "e014a6b9-3ea1-4c50-a410-6cf81d5c6923"). InnerVolumeSpecName "kube-api-access-vxfpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.009210 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e014a6b9-3ea1-4c50-a410-6cf81d5c6923" (UID: "e014a6b9-3ea1-4c50-a410-6cf81d5c6923"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.015957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-scripts" (OuterVolumeSpecName: "scripts") pod "e014a6b9-3ea1-4c50-a410-6cf81d5c6923" (UID: "e014a6b9-3ea1-4c50-a410-6cf81d5c6923"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.046635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e014a6b9-3ea1-4c50-a410-6cf81d5c6923" (UID: "e014a6b9-3ea1-4c50-a410-6cf81d5c6923"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.103632 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.103663 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxfpb\" (UniqueName: \"kubernetes.io/projected/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-kube-api-access-vxfpb\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.103675 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.103683 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.103693 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.107740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e014a6b9-3ea1-4c50-a410-6cf81d5c6923" (UID: "e014a6b9-3ea1-4c50-a410-6cf81d5c6923"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.125344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-config-data" (OuterVolumeSpecName: "config-data") pod "e014a6b9-3ea1-4c50-a410-6cf81d5c6923" (UID: "e014a6b9-3ea1-4c50-a410-6cf81d5c6923"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.205293 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.205632 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e014a6b9-3ea1-4c50-a410-6cf81d5c6923-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.255439 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b94873-970a-4564-a6b4-1e07adee8365" path="/var/lib/kubelet/pods/a3b94873-970a-4564-a6b4-1e07adee8365/volumes"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.859068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd","Type":"ContainerStarted","Data":"0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361"}
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.859495 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.863871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e014a6b9-3ea1-4c50-a410-6cf81d5c6923","Type":"ContainerDied","Data":"bf91bb8f8bf8fd2a6c43b1a61c32575ccd2a4b87bfeaacb65d8fc50619bcb5ca"}
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.863946 4707 scope.go:117] "RemoveContainer" containerID="d632e85024544d232a053ac7743aa63432b275a358b7487b042c9b5d2636eec4"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.864002 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.884761 4707 scope.go:117] "RemoveContainer" containerID="4739e659e3af3d1e16baa927871e6bea8ac09784cc4f793ab3b4153db277b4eb"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.886448 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.886424888 podStartE2EDuration="2.886424888s" podCreationTimestamp="2026-01-29 03:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:48:15.884933336 +0000 UTC m=+1249.369162241" watchObservedRunningTime="2026-01-29 03:48:15.886424888 +0000 UTC m=+1249.370653793"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.905087 4707 scope.go:117] "RemoveContainer" containerID="87a2bc946cf7fdbe88c88c52243aaab67f855b15c45fbf77b64004209ab92506"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.907918 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.920167 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.941751 4707 scope.go:117] "RemoveContainer" containerID="c1e5e92878eb2a2340e520bc71d2aaa48a2dd840d2225d9bd2bc6131010e15df"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.943259 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:48:15 crc kubenswrapper[4707]: E0129 03:48:15.943856 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="ceilometer-notification-agent"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.943881 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="ceilometer-notification-agent"
Jan 29 03:48:15 crc kubenswrapper[4707]: E0129 03:48:15.943905 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="ceilometer-central-agent"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.943912 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="ceilometer-central-agent"
Jan 29 03:48:15 crc kubenswrapper[4707]: E0129 03:48:15.943940 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="sg-core"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.943947 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="sg-core"
Jan 29 03:48:15 crc kubenswrapper[4707]: E0129 03:48:15.943970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="proxy-httpd"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.943981 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="proxy-httpd"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.944204 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="ceilometer-notification-agent"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.944226 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="ceilometer-central-agent"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.944236 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="proxy-httpd"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.944250 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" containerName="sg-core"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.947056 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.953025 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.954127 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 29 03:48:15 crc kubenswrapper[4707]: I0129 03:48:15.955763 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.023525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk7kl\" (UniqueName: \"kubernetes.io/projected/977fec89-9e03-4b52-86aa-ec04969e7b45-kube-api-access-gk7kl\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.023620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-scripts\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.023674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-config-data\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.023708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-run-httpd\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.023989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.024092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-log-httpd\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.024136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.126592 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk7kl\" (UniqueName: \"kubernetes.io/projected/977fec89-9e03-4b52-86aa-ec04969e7b45-kube-api-access-gk7kl\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.126698 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-scripts\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.126753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-config-data\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.126796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-run-httpd\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.126881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.126909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-log-httpd\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.126934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0"
Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.127592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName:
\"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-run-httpd\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0" Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.127953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-log-httpd\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0" Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.133700 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-config-data\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0" Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.134773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0" Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.140294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0" Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.142056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-scripts\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0" Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.144444 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gk7kl\" (UniqueName: \"kubernetes.io/projected/977fec89-9e03-4b52-86aa-ec04969e7b45-kube-api-access-gk7kl\") pod \"ceilometer-0\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " pod="openstack/ceilometer-0" Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.185164 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.285836 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.808396 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:48:16 crc kubenswrapper[4707]: W0129 03:48:16.809461 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod977fec89_9e03_4b52_86aa_ec04969e7b45.slice/crio-b83ad2074c06c03a563a0e6e16a12dbeb7f46b64504d4c6b8c611d53999e40fe WatchSource:0}: Error finding container b83ad2074c06c03a563a0e6e16a12dbeb7f46b64504d4c6b8c611d53999e40fe: Status 404 returned error can't find the container with id b83ad2074c06c03a563a0e6e16a12dbeb7f46b64504d4c6b8c611d53999e40fe Jan 29 03:48:16 crc kubenswrapper[4707]: I0129 03:48:16.881413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerStarted","Data":"b83ad2074c06c03a563a0e6e16a12dbeb7f46b64504d4c6b8c611d53999e40fe"} Jan 29 03:48:17 crc kubenswrapper[4707]: I0129 03:48:17.256176 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e014a6b9-3ea1-4c50-a410-6cf81d5c6923" path="/var/lib/kubelet/pods/e014a6b9-3ea1-4c50-a410-6cf81d5c6923/volumes" Jan 29 03:48:17 crc kubenswrapper[4707]: I0129 03:48:17.629767 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:48:17 crc 
kubenswrapper[4707]: I0129 03:48:17.899705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerStarted","Data":"84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1"} Jan 29 03:48:17 crc kubenswrapper[4707]: I0129 03:48:17.899801 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerName="nova-cell0-conductor-conductor" containerID="cri-o://0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" gracePeriod=30 Jan 29 03:48:18 crc kubenswrapper[4707]: I0129 03:48:18.911392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerStarted","Data":"2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8"} Jan 29 03:48:18 crc kubenswrapper[4707]: I0129 03:48:18.912256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerStarted","Data":"54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5"} Jan 29 03:48:21 crc kubenswrapper[4707]: I0129 03:48:21.962504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerStarted","Data":"aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04"} Jan 29 03:48:21 crc kubenswrapper[4707]: I0129 03:48:21.963506 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 03:48:21 crc kubenswrapper[4707]: I0129 03:48:21.962793 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="sg-core" 
containerID="cri-o://2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8" gracePeriod=30 Jan 29 03:48:21 crc kubenswrapper[4707]: I0129 03:48:21.962711 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="ceilometer-central-agent" containerID="cri-o://84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1" gracePeriod=30 Jan 29 03:48:21 crc kubenswrapper[4707]: I0129 03:48:21.962819 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="proxy-httpd" containerID="cri-o://aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04" gracePeriod=30 Jan 29 03:48:21 crc kubenswrapper[4707]: I0129 03:48:21.962834 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="ceilometer-notification-agent" containerID="cri-o://54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5" gracePeriod=30 Jan 29 03:48:22 crc kubenswrapper[4707]: I0129 03:48:22.003395 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.275688563 podStartE2EDuration="7.003367607s" podCreationTimestamp="2026-01-29 03:48:15 +0000 UTC" firstStartedPulling="2026-01-29 03:48:16.812571129 +0000 UTC m=+1250.296800044" lastFinishedPulling="2026-01-29 03:48:20.540250183 +0000 UTC m=+1254.024479088" observedRunningTime="2026-01-29 03:48:21.999891768 +0000 UTC m=+1255.484120673" watchObservedRunningTime="2026-01-29 03:48:22.003367607 +0000 UTC m=+1255.487596512" Jan 29 03:48:22 crc kubenswrapper[4707]: I0129 03:48:22.995993 4707 generic.go:334] "Generic (PLEG): container finished" podID="977fec89-9e03-4b52-86aa-ec04969e7b45" 
containerID="aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04" exitCode=0 Jan 29 03:48:22 crc kubenswrapper[4707]: I0129 03:48:22.996416 4707 generic.go:334] "Generic (PLEG): container finished" podID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerID="2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8" exitCode=2 Jan 29 03:48:22 crc kubenswrapper[4707]: I0129 03:48:22.996426 4707 generic.go:334] "Generic (PLEG): container finished" podID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerID="54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5" exitCode=0 Jan 29 03:48:22 crc kubenswrapper[4707]: I0129 03:48:22.996078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerDied","Data":"aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04"} Jan 29 03:48:22 crc kubenswrapper[4707]: I0129 03:48:22.996462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerDied","Data":"2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8"} Jan 29 03:48:22 crc kubenswrapper[4707]: I0129 03:48:22.996473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerDied","Data":"54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5"} Jan 29 03:48:24 crc kubenswrapper[4707]: E0129 03:48:24.294405 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:24 crc kubenswrapper[4707]: E0129 03:48:24.296495 4707 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:24 crc kubenswrapper[4707]: E0129 03:48:24.299208 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:24 crc kubenswrapper[4707]: E0129 03:48:24.299238 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerName="nova-cell0-conductor-conductor" Jan 29 03:48:27 crc kubenswrapper[4707]: I0129 03:48:27.923284 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.064183 4707 generic.go:334] "Generic (PLEG): container finished" podID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerID="84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1" exitCode=0 Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.064258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerDied","Data":"84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1"} Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.064322 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.064350 4707 scope.go:117] "RemoveContainer" containerID="aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.064332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"977fec89-9e03-4b52-86aa-ec04969e7b45","Type":"ContainerDied","Data":"b83ad2074c06c03a563a0e6e16a12dbeb7f46b64504d4c6b8c611d53999e40fe"} Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.090659 4707 scope.go:117] "RemoveContainer" containerID="2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.106675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-scripts\") pod \"977fec89-9e03-4b52-86aa-ec04969e7b45\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.107153 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk7kl\" (UniqueName: \"kubernetes.io/projected/977fec89-9e03-4b52-86aa-ec04969e7b45-kube-api-access-gk7kl\") pod \"977fec89-9e03-4b52-86aa-ec04969e7b45\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.107194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-run-httpd\") pod \"977fec89-9e03-4b52-86aa-ec04969e7b45\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.107331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-config-data\") pod \"977fec89-9e03-4b52-86aa-ec04969e7b45\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.107425 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-log-httpd\") pod \"977fec89-9e03-4b52-86aa-ec04969e7b45\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.107472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-sg-core-conf-yaml\") pod \"977fec89-9e03-4b52-86aa-ec04969e7b45\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.107520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-combined-ca-bundle\") pod \"977fec89-9e03-4b52-86aa-ec04969e7b45\" (UID: \"977fec89-9e03-4b52-86aa-ec04969e7b45\") " Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.107857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "977fec89-9e03-4b52-86aa-ec04969e7b45" (UID: "977fec89-9e03-4b52-86aa-ec04969e7b45"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.108040 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "977fec89-9e03-4b52-86aa-ec04969e7b45" (UID: "977fec89-9e03-4b52-86aa-ec04969e7b45"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.108332 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.108356 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/977fec89-9e03-4b52-86aa-ec04969e7b45-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.114990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977fec89-9e03-4b52-86aa-ec04969e7b45-kube-api-access-gk7kl" (OuterVolumeSpecName: "kube-api-access-gk7kl") pod "977fec89-9e03-4b52-86aa-ec04969e7b45" (UID: "977fec89-9e03-4b52-86aa-ec04969e7b45"). InnerVolumeSpecName "kube-api-access-gk7kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.119694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-scripts" (OuterVolumeSpecName: "scripts") pod "977fec89-9e03-4b52-86aa-ec04969e7b45" (UID: "977fec89-9e03-4b52-86aa-ec04969e7b45"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.133622 4707 scope.go:117] "RemoveContainer" containerID="54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.142662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "977fec89-9e03-4b52-86aa-ec04969e7b45" (UID: "977fec89-9e03-4b52-86aa-ec04969e7b45"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.209314 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.209349 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk7kl\" (UniqueName: \"kubernetes.io/projected/977fec89-9e03-4b52-86aa-ec04969e7b45-kube-api-access-gk7kl\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.209364 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.212912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "977fec89-9e03-4b52-86aa-ec04969e7b45" (UID: "977fec89-9e03-4b52-86aa-ec04969e7b45"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.220149 4707 scope.go:117] "RemoveContainer" containerID="84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.242928 4707 scope.go:117] "RemoveContainer" containerID="aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04" Jan 29 03:48:28 crc kubenswrapper[4707]: E0129 03:48:28.243628 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04\": container with ID starting with aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04 not found: ID does not exist" containerID="aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.243688 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04"} err="failed to get container status \"aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04\": rpc error: code = NotFound desc = could not find container \"aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04\": container with ID starting with aad9535c6cb3c940aa901881ca960cde5c1eda6c2d21015588885351962b2e04 not found: ID does not exist" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.243752 4707 scope.go:117] "RemoveContainer" containerID="2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.244187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-config-data" (OuterVolumeSpecName: "config-data") pod "977fec89-9e03-4b52-86aa-ec04969e7b45" (UID: "977fec89-9e03-4b52-86aa-ec04969e7b45"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:28 crc kubenswrapper[4707]: E0129 03:48:28.244203 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8\": container with ID starting with 2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8 not found: ID does not exist" containerID="2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.244428 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8"} err="failed to get container status \"2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8\": rpc error: code = NotFound desc = could not find container \"2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8\": container with ID starting with 2d3a6ba9db5eed97f76cc7f39186b5ef5f1fa27a306181554d244f31970c0df8 not found: ID does not exist" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.244603 4707 scope.go:117] "RemoveContainer" containerID="54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5" Jan 29 03:48:28 crc kubenswrapper[4707]: E0129 03:48:28.245185 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5\": container with ID starting with 54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5 not found: ID does not exist" containerID="54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.245311 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5"} err="failed to get container status \"54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5\": rpc error: code = NotFound desc = could not find container \"54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5\": container with ID starting with 54355b8d15e5b4b75f133ee367967643515738e1f82f3f9f17d6c00385df81b5 not found: ID does not exist" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.245409 4707 scope.go:117] "RemoveContainer" containerID="84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1" Jan 29 03:48:28 crc kubenswrapper[4707]: E0129 03:48:28.246092 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1\": container with ID starting with 84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1 not found: ID does not exist" containerID="84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.246139 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1"} err="failed to get container status \"84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1\": rpc error: code = NotFound desc = could not find container \"84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1\": container with ID starting with 84a6617bafe1a5d4f4aab165ba9a80b098d1fbca99ebde6cf27fe079202b91c1 not found: ID does not exist" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.312520 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:28 crc 
kubenswrapper[4707]: I0129 03:48:28.313381 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/977fec89-9e03-4b52-86aa-ec04969e7b45-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.410778 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.427632 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.440272 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:48:28 crc kubenswrapper[4707]: E0129 03:48:28.440773 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="ceilometer-notification-agent" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.440790 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="ceilometer-notification-agent" Jan 29 03:48:28 crc kubenswrapper[4707]: E0129 03:48:28.440807 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="proxy-httpd" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.440813 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="proxy-httpd" Jan 29 03:48:28 crc kubenswrapper[4707]: E0129 03:48:28.440835 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="ceilometer-central-agent" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.440841 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="ceilometer-central-agent" Jan 29 03:48:28 crc kubenswrapper[4707]: E0129 03:48:28.440871 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="sg-core" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.440876 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="sg-core" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.441048 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="ceilometer-notification-agent" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.441063 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="ceilometer-central-agent" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.441080 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="sg-core" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.441089 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" containerName="proxy-httpd" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.442861 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.447014 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.447312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.457150 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.517530 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-scripts\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.517685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.517783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-log-httpd\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.517852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-config-data\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " 
pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.517948 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-466tz\" (UniqueName: \"kubernetes.io/projected/c86d5142-df8a-443d-9cd0-127c672072b8-kube-api-access-466tz\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.518005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.518079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-run-httpd\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.618851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-scripts\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.618955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.619017 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-log-httpd\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.619056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-config-data\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.619128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-466tz\" (UniqueName: \"kubernetes.io/projected/c86d5142-df8a-443d-9cd0-127c672072b8-kube-api-access-466tz\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.619164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.619209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-run-httpd\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.620825 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-log-httpd\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 
crc kubenswrapper[4707]: I0129 03:48:28.621195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-run-httpd\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.625495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-config-data\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.626264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.632374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.635421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-scripts\") pod \"ceilometer-0\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.640760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-466tz\" (UniqueName: \"kubernetes.io/projected/c86d5142-df8a-443d-9cd0-127c672072b8-kube-api-access-466tz\") pod \"ceilometer-0\" (UID: 
\"c86d5142-df8a-443d-9cd0-127c672072b8\") " pod="openstack/ceilometer-0" Jan 29 03:48:28 crc kubenswrapper[4707]: I0129 03:48:28.762346 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.256596 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="977fec89-9e03-4b52-86aa-ec04969e7b45" path="/var/lib/kubelet/pods/977fec89-9e03-4b52-86aa-ec04969e7b45/volumes" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.275721 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:48:29 crc kubenswrapper[4707]: E0129 03:48:29.296098 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:29 crc kubenswrapper[4707]: E0129 03:48:29.298378 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:29 crc kubenswrapper[4707]: E0129 03:48:29.300318 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:29 crc kubenswrapper[4707]: E0129 03:48:29.300356 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerName="nova-cell0-conductor-conductor" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.692069 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.840873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gkpz\" (UniqueName: \"kubernetes.io/projected/0ec32f0d-8609-4579-abc4-1af5f98df4cf-kube-api-access-4gkpz\") pod \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.840990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data-custom\") pod \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.841112 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data\") pod \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.841250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-combined-ca-bundle\") pod \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\" (UID: \"0ec32f0d-8609-4579-abc4-1af5f98df4cf\") " Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.849950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0ec32f0d-8609-4579-abc4-1af5f98df4cf-kube-api-access-4gkpz" (OuterVolumeSpecName: "kube-api-access-4gkpz") pod "0ec32f0d-8609-4579-abc4-1af5f98df4cf" (UID: "0ec32f0d-8609-4579-abc4-1af5f98df4cf"). InnerVolumeSpecName "kube-api-access-4gkpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.859906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0ec32f0d-8609-4579-abc4-1af5f98df4cf" (UID: "0ec32f0d-8609-4579-abc4-1af5f98df4cf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.872629 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ec32f0d-8609-4579-abc4-1af5f98df4cf" (UID: "0ec32f0d-8609-4579-abc4-1af5f98df4cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.935878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data" (OuterVolumeSpecName: "config-data") pod "0ec32f0d-8609-4579-abc4-1af5f98df4cf" (UID: "0ec32f0d-8609-4579-abc4-1af5f98df4cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.944834 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.944869 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gkpz\" (UniqueName: \"kubernetes.io/projected/0ec32f0d-8609-4579-abc4-1af5f98df4cf-kube-api-access-4gkpz\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.944886 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:29 crc kubenswrapper[4707]: I0129 03:48:29.944895 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec32f0d-8609-4579-abc4-1af5f98df4cf-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.103715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerStarted","Data":"9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113"} Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.104699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerStarted","Data":"b647ccc2e1f1e9e36dd01c7142cfe618c90741a1e0421d1157d28f1a0797c68a"} Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.105664 4707 generic.go:334] "Generic (PLEG): container finished" podID="0ec32f0d-8609-4579-abc4-1af5f98df4cf" containerID="35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939" exitCode=137 Jan 
29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.105707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" event={"ID":"0ec32f0d-8609-4579-abc4-1af5f98df4cf","Type":"ContainerDied","Data":"35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939"} Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.105729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" event={"ID":"0ec32f0d-8609-4579-abc4-1af5f98df4cf","Type":"ContainerDied","Data":"527a3d9d334bc033a1a2dcde68929c2df7846106ed5f96e935c5431d1a7b5dee"} Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.105750 4707 scope.go:117] "RemoveContainer" containerID="35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939" Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.105941 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7bdf7467d4-ptkdm" Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.141399 4707 scope.go:117] "RemoveContainer" containerID="35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939" Jan 29 03:48:30 crc kubenswrapper[4707]: E0129 03:48:30.144260 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939\": container with ID starting with 35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939 not found: ID does not exist" containerID="35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939" Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.144321 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939"} err="failed to get container status \"35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939\": rpc error: code = NotFound 
desc = could not find container \"35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939\": container with ID starting with 35bb3c83569ec62bc6d9259fbd0ca6191c43065f82a204f63857b8016c323939 not found: ID does not exist" Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.151237 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7bdf7467d4-ptkdm"] Jan 29 03:48:30 crc kubenswrapper[4707]: I0129 03:48:30.161008 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7bdf7467d4-ptkdm"] Jan 29 03:48:31 crc kubenswrapper[4707]: I0129 03:48:31.116381 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerStarted","Data":"3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef"} Jan 29 03:48:31 crc kubenswrapper[4707]: I0129 03:48:31.254940 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec32f0d-8609-4579-abc4-1af5f98df4cf" path="/var/lib/kubelet/pods/0ec32f0d-8609-4579-abc4-1af5f98df4cf/volumes" Jan 29 03:48:32 crc kubenswrapper[4707]: I0129 03:48:32.129222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerStarted","Data":"cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9"} Jan 29 03:48:34 crc kubenswrapper[4707]: I0129 03:48:34.158359 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerStarted","Data":"4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d"} Jan 29 03:48:34 crc kubenswrapper[4707]: I0129 03:48:34.159083 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 03:48:34 crc kubenswrapper[4707]: E0129 03:48:34.294510 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:34 crc kubenswrapper[4707]: E0129 03:48:34.296178 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:34 crc kubenswrapper[4707]: E0129 03:48:34.298881 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:34 crc kubenswrapper[4707]: E0129 03:48:34.298913 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerName="nova-cell0-conductor-conductor" Jan 29 03:48:39 crc kubenswrapper[4707]: E0129 03:48:39.301340 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:39 crc kubenswrapper[4707]: E0129 03:48:39.306241 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:39 crc kubenswrapper[4707]: E0129 03:48:39.308311 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:39 crc kubenswrapper[4707]: E0129 03:48:39.308387 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerName="nova-cell0-conductor-conductor" Jan 29 03:48:44 crc kubenswrapper[4707]: E0129 03:48:44.295260 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:44 crc kubenswrapper[4707]: E0129 03:48:44.298875 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:44 crc kubenswrapper[4707]: E0129 03:48:44.301092 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 03:48:44 crc kubenswrapper[4707]: E0129 03:48:44.301232 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerName="nova-cell0-conductor-conductor" Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.343125 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" exitCode=137 Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.344194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd","Type":"ContainerDied","Data":"0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361"} Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.583154 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.617352 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=16.846989261 podStartE2EDuration="20.617331395s" podCreationTimestamp="2026-01-29 03:48:28 +0000 UTC" firstStartedPulling="2026-01-29 03:48:29.288398481 +0000 UTC m=+1262.772627386" lastFinishedPulling="2026-01-29 03:48:33.058740575 +0000 UTC m=+1266.542969520" observedRunningTime="2026-01-29 03:48:34.188469232 +0000 UTC m=+1267.672698157" watchObservedRunningTime="2026-01-29 03:48:48.617331395 +0000 UTC m=+1282.101560300" Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.685237 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkvqm\" (UniqueName: \"kubernetes.io/projected/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-kube-api-access-jkvqm\") pod \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.685834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-config-data\") pod \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.685883 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-combined-ca-bundle\") pod \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\" (UID: \"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd\") " Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.705980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-kube-api-access-jkvqm" (OuterVolumeSpecName: 
"kube-api-access-jkvqm") pod "e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" (UID: "e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd"). InnerVolumeSpecName "kube-api-access-jkvqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.722455 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" (UID: "e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.742860 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-config-data" (OuterVolumeSpecName: "config-data") pod "e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" (UID: "e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.788339 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkvqm\" (UniqueName: \"kubernetes.io/projected/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-kube-api-access-jkvqm\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.788379 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:48 crc kubenswrapper[4707]: I0129 03:48:48.788392 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.357041 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd","Type":"ContainerDied","Data":"10656f5f1e3731a39c02de41339dedb6439499941671903a9f37431eed541bc3"} Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.357112 4707 scope.go:117] "RemoveContainer" containerID="0254ef6bb6ae12e563ca3ab83a0b697241127d2c26653182def06de5ba0f0361" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.357160 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.408153 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.424232 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.447286 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 03:48:49 crc kubenswrapper[4707]: E0129 03:48:49.447875 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerName="nova-cell0-conductor-conductor" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.447897 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerName="nova-cell0-conductor-conductor" Jan 29 03:48:49 crc kubenswrapper[4707]: E0129 03:48:49.447917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec32f0d-8609-4579-abc4-1af5f98df4cf" containerName="heat-cfnapi" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.447925 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec32f0d-8609-4579-abc4-1af5f98df4cf" containerName="heat-cfnapi" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.448166 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" containerName="nova-cell0-conductor-conductor" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.448193 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec32f0d-8609-4579-abc4-1af5f98df4cf" containerName="heat-cfnapi" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.448962 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.455241 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pkw2v" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.455557 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.478177 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.622038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66aa645d-9edf-4791-a8f7-2607ad442104-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"66aa645d-9edf-4791-a8f7-2607ad442104\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.622180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvbw2\" (UniqueName: \"kubernetes.io/projected/66aa645d-9edf-4791-a8f7-2607ad442104-kube-api-access-bvbw2\") pod \"nova-cell0-conductor-0\" (UID: \"66aa645d-9edf-4791-a8f7-2607ad442104\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.622228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aa645d-9edf-4791-a8f7-2607ad442104-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"66aa645d-9edf-4791-a8f7-2607ad442104\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.724273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvbw2\" (UniqueName: 
\"kubernetes.io/projected/66aa645d-9edf-4791-a8f7-2607ad442104-kube-api-access-bvbw2\") pod \"nova-cell0-conductor-0\" (UID: \"66aa645d-9edf-4791-a8f7-2607ad442104\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.724374 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aa645d-9edf-4791-a8f7-2607ad442104-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"66aa645d-9edf-4791-a8f7-2607ad442104\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.724466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66aa645d-9edf-4791-a8f7-2607ad442104-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"66aa645d-9edf-4791-a8f7-2607ad442104\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.729475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66aa645d-9edf-4791-a8f7-2607ad442104-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"66aa645d-9edf-4791-a8f7-2607ad442104\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.731701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aa645d-9edf-4791-a8f7-2607ad442104-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"66aa645d-9edf-4791-a8f7-2607ad442104\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.746338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvbw2\" (UniqueName: \"kubernetes.io/projected/66aa645d-9edf-4791-a8f7-2607ad442104-kube-api-access-bvbw2\") pod \"nova-cell0-conductor-0\" (UID: 
\"66aa645d-9edf-4791-a8f7-2607ad442104\") " pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:49 crc kubenswrapper[4707]: I0129 03:48:49.777952 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 03:48:50 crc kubenswrapper[4707]: I0129 03:48:50.356946 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 03:48:50 crc kubenswrapper[4707]: W0129 03:48:50.365588 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66aa645d_9edf_4791_a8f7_2607ad442104.slice/crio-cf2d75a71ecd2dc277ac1a1a111320edf098591caf79289cad4f46ae17c944b9 WatchSource:0}: Error finding container cf2d75a71ecd2dc277ac1a1a111320edf098591caf79289cad4f46ae17c944b9: Status 404 returned error can't find the container with id cf2d75a71ecd2dc277ac1a1a111320edf098591caf79289cad4f46ae17c944b9 Jan 29 03:48:51 crc kubenswrapper[4707]: I0129 03:48:51.257116 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd" path="/var/lib/kubelet/pods/e6b783a8-a9cd-43fe-adfc-ef70d2eff3bd/volumes" Jan 29 03:48:51 crc kubenswrapper[4707]: I0129 03:48:51.397201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"66aa645d-9edf-4791-a8f7-2607ad442104","Type":"ContainerStarted","Data":"8e42e44d941e9d16d246c2d9026cae3c462264e5a14412ed185348061137da05"} Jan 29 03:48:51 crc kubenswrapper[4707]: I0129 03:48:51.397297 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"66aa645d-9edf-4791-a8f7-2607ad442104","Type":"ContainerStarted","Data":"cf2d75a71ecd2dc277ac1a1a111320edf098591caf79289cad4f46ae17c944b9"} Jan 29 03:48:51 crc kubenswrapper[4707]: I0129 03:48:51.397457 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" 
Jan 29 03:48:58 crc kubenswrapper[4707]: I0129 03:48:58.769630 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 29 03:48:58 crc kubenswrapper[4707]: I0129 03:48:58.799276 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=9.799254456 podStartE2EDuration="9.799254456s" podCreationTimestamp="2026-01-29 03:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:48:51.430717777 +0000 UTC m=+1284.914946732" watchObservedRunningTime="2026-01-29 03:48:58.799254456 +0000 UTC m=+1292.283483361" Jan 29 03:48:59 crc kubenswrapper[4707]: I0129 03:48:59.813206 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.329750 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vm7jk"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.331016 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.333646 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.338271 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.338454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-config-data\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.338493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-scripts\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.338663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnmtt\" (UniqueName: \"kubernetes.io/projected/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-kube-api-access-nnmtt\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.342971 4707 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"nova-cell0-manage-scripts" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.362059 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vm7jk"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.441584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-config-data\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.441643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-scripts\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.441689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnmtt\" (UniqueName: \"kubernetes.io/projected/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-kube-api-access-nnmtt\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.441783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.450891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-config-data\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.470197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.479886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-scripts\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.499745 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnmtt\" (UniqueName: \"kubernetes.io/projected/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-kube-api-access-nnmtt\") pod \"nova-cell0-cell-mapping-vm7jk\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") " pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.541177 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.542814 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.555955 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.556316 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.657219 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-config-data\") pod \"nova-scheduler-0\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.657328 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.657402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfdlz\" (UniqueName: \"kubernetes.io/projected/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-kube-api-access-lfdlz\") pod \"nova-scheduler-0\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.681108 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vm7jk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.689300 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.691421 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.695888 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.742612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.763528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-config-data\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.763600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.763670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdlz\" (UniqueName: \"kubernetes.io/projected/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-kube-api-access-lfdlz\") pod \"nova-scheduler-0\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.763703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-config-data\") pod \"nova-scheduler-0\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.763722 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.763740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82fkd\" (UniqueName: \"kubernetes.io/projected/d97051dd-853f-4295-a509-3bd719a6a907-kube-api-access-82fkd\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.763757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d97051dd-853f-4295-a509-3bd719a6a907-logs\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.779226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-config-data\") pod \"nova-scheduler-0\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.783878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.813140 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-kkkwk"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.814795 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.827350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdlz\" (UniqueName: \"kubernetes.io/projected/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-kube-api-access-lfdlz\") pod \"nova-scheduler-0\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.864820 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.864861 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82fkd\" (UniqueName: \"kubernetes.io/projected/d97051dd-853f-4295-a509-3bd719a6a907-kube-api-access-82fkd\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.864884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d97051dd-853f-4295-a509-3bd719a6a907-logs\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.864943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-config-data\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.866095 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d97051dd-853f-4295-a509-3bd719a6a907-logs\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.868451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-config-data\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.869016 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-kkkwk"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.873177 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.895287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82fkd\" (UniqueName: \"kubernetes.io/projected/d97051dd-853f-4295-a509-3bd719a6a907-kube-api-access-82fkd\") pod \"nova-metadata-0\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " pod="openstack/nova-metadata-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.914612 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.933441 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.941688 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.946216 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.961885 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.969426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgspp\" (UniqueName: \"kubernetes.io/projected/74e85360-0f32-4475-a5e8-9af4d4085841-kube-api-access-qgspp\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.970770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.971054 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.971186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-config\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 
29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.971275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-svc\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.971380 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.972028 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.973498 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:00 crc kubenswrapper[4707]: I0129 03:49:00.984068 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.015194 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-config\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-svc\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073172 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073196 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp7rv\" (UniqueName: \"kubernetes.io/projected/3df36f3e-f1ff-4a7a-94f5-251e977949cc-kube-api-access-dp7rv\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073221 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgspp\" (UniqueName: \"kubernetes.io/projected/74e85360-0f32-4475-a5e8-9af4d4085841-kube-api-access-qgspp\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073298 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-config-data\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x754\" (UniqueName: \"kubernetes.io/projected/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-kube-api-access-7x754\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073348 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df36f3e-f1ff-4a7a-94f5-251e977949cc-logs\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073365 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.073681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.076008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.077287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-svc\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.078329 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.078436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.085188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-config\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.109302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgspp\" (UniqueName: \"kubernetes.io/projected/74e85360-0f32-4475-a5e8-9af4d4085841-kube-api-access-qgspp\") pod \"dnsmasq-dns-9b86998b5-kkkwk\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.175335 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.176475 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.176626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp7rv\" (UniqueName: \"kubernetes.io/projected/3df36f3e-f1ff-4a7a-94f5-251e977949cc-kube-api-access-dp7rv\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.176707 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.176749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-config-data\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.176781 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x754\" (UniqueName: \"kubernetes.io/projected/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-kube-api-access-7x754\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.176821 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df36f3e-f1ff-4a7a-94f5-251e977949cc-logs\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.176843 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.177884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df36f3e-f1ff-4a7a-94f5-251e977949cc-logs\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.184419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.189294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-config-data\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.190158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 
crc kubenswrapper[4707]: I0129 03:49:01.194375 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.201133 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x754\" (UniqueName: \"kubernetes.io/projected/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-kube-api-access-7x754\") pod \"nova-cell1-novncproxy-0\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.201723 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.202863 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp7rv\" (UniqueName: \"kubernetes.io/projected/3df36f3e-f1ff-4a7a-94f5-251e977949cc-kube-api-access-dp7rv\") pod \"nova-api-0\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.326466 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.345660 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.479330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vm7jk"] Jan 29 03:49:01 crc kubenswrapper[4707]: W0129 03:49:01.532272 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83e0e811_830f_4e5c_a3bb_0dd5be01cf3d.slice/crio-3f632aac2a75749156b81cbdce0c3118a791f4d4c17523382cc746a168f592a1 WatchSource:0}: Error finding container 3f632aac2a75749156b81cbdce0c3118a791f4d4c17523382cc746a168f592a1: Status 404 returned error can't find the container with id 3f632aac2a75749156b81cbdce0c3118a791f4d4c17523382cc746a168f592a1 Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.675811 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.717391 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 03:49:01 crc kubenswrapper[4707]: I0129 03:49:01.942963 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.154941 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-kkkwk"] Jan 29 03:49:02 crc kubenswrapper[4707]: W0129 03:49:02.160515 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74e85360_0f32_4475_a5e8_9af4d4085841.slice/crio-1adf593588d7b0e2bc26842217e19bfa2bcdf92a6341071f0860f3332fa77b07 WatchSource:0}: Error finding container 1adf593588d7b0e2bc26842217e19bfa2bcdf92a6341071f0860f3332fa77b07: Status 404 returned error can't find the container with id 1adf593588d7b0e2bc26842217e19bfa2bcdf92a6341071f0860f3332fa77b07 Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 
03:49:02.237194 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vqq2"] Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.238841 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.241520 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.251763 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.299761 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vqq2"] Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.316111 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.322506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcmdm\" (UniqueName: \"kubernetes.io/projected/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-kube-api-access-vcmdm\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.322586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-config-data\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.322665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-scripts\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.322735 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.376511 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.425813 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcmdm\" (UniqueName: \"kubernetes.io/projected/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-kube-api-access-vcmdm\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.425892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-config-data\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.425942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-scripts\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " 
pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.425988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.433253 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-config-data\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.434172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-scripts\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.438155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.443704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcmdm\" (UniqueName: \"kubernetes.io/projected/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-kube-api-access-vcmdm\") pod \"nova-cell1-conductor-db-sync-5vqq2\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") " 
pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.579960 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.625309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"654bc6e4-18a2-4da6-a30a-9af36c8df0ec","Type":"ContainerStarted","Data":"c9ced0953b0dd37cabdcb8cce222d34a69e6525c03964d0877a4af69f29ecc4f"} Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.637898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3df36f3e-f1ff-4a7a-94f5-251e977949cc","Type":"ContainerStarted","Data":"3085370e438d89cc2efb6316eef504add112e056341b1a2d7e3db3aa29afc2f6"} Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.640780 4707 generic.go:334] "Generic (PLEG): container finished" podID="74e85360-0f32-4475-a5e8-9af4d4085841" containerID="077108a8d9306f14acd08aeef68427d2b33186b6664b47ae784cdb80e983578b" exitCode=0 Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.640865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" event={"ID":"74e85360-0f32-4475-a5e8-9af4d4085841","Type":"ContainerDied","Data":"077108a8d9306f14acd08aeef68427d2b33186b6664b47ae784cdb80e983578b"} Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.640898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" event={"ID":"74e85360-0f32-4475-a5e8-9af4d4085841","Type":"ContainerStarted","Data":"1adf593588d7b0e2bc26842217e19bfa2bcdf92a6341071f0860f3332fa77b07"} Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.643303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vm7jk" 
event={"ID":"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d","Type":"ContainerStarted","Data":"89cdc2f080c47e24de0dc68ad9f0a1af58af820c56016721a5bde4bfcabf3422"} Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.643330 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vm7jk" event={"ID":"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d","Type":"ContainerStarted","Data":"3f632aac2a75749156b81cbdce0c3118a791f4d4c17523382cc746a168f592a1"} Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.655474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f","Type":"ContainerStarted","Data":"fc46cb4a430b2a5abd089ff397272a996b1f40bfd6c4baabe044e737391658a5"} Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.661623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d97051dd-853f-4295-a509-3bd719a6a907","Type":"ContainerStarted","Data":"c437c2b305f70b2fc36be5361508f782ed6ec8eca7b694372918d118cf35044b"} Jan 29 03:49:02 crc kubenswrapper[4707]: I0129 03:49:02.721712 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vm7jk" podStartSLOduration=2.721685334 podStartE2EDuration="2.721685334s" podCreationTimestamp="2026-01-29 03:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:02.700749406 +0000 UTC m=+1296.184978311" watchObservedRunningTime="2026-01-29 03:49:02.721685334 +0000 UTC m=+1296.205914239" Jan 29 03:49:03 crc kubenswrapper[4707]: I0129 03:49:03.200458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vqq2"] Jan 29 03:49:03 crc kubenswrapper[4707]: W0129 03:49:03.227158 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podededf3cb_8e51_4ba6_a3d1_dc12980e13d1.slice/crio-06b8db512e65f764adedd7be3a2c5e1351e86f7ef58e696a6fc88e7d139479d6 WatchSource:0}: Error finding container 06b8db512e65f764adedd7be3a2c5e1351e86f7ef58e696a6fc88e7d139479d6: Status 404 returned error can't find the container with id 06b8db512e65f764adedd7be3a2c5e1351e86f7ef58e696a6fc88e7d139479d6 Jan 29 03:49:03 crc kubenswrapper[4707]: I0129 03:49:03.476969 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:49:03 crc kubenswrapper[4707]: I0129 03:49:03.477364 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:49:03 crc kubenswrapper[4707]: I0129 03:49:03.679850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vqq2" event={"ID":"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1","Type":"ContainerStarted","Data":"9f316524c65dbf31e260d11849fb730356d7a03a052a854227521385e116f055"} Jan 29 03:49:03 crc kubenswrapper[4707]: I0129 03:49:03.679918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vqq2" event={"ID":"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1","Type":"ContainerStarted","Data":"06b8db512e65f764adedd7be3a2c5e1351e86f7ef58e696a6fc88e7d139479d6"} Jan 29 03:49:03 crc kubenswrapper[4707]: I0129 03:49:03.683861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" 
event={"ID":"74e85360-0f32-4475-a5e8-9af4d4085841","Type":"ContainerStarted","Data":"93a155f7f55325167c7cd607feca85760d572069f39cc7e0489c899961c25798"} Jan 29 03:49:03 crc kubenswrapper[4707]: I0129 03:49:03.683985 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:03 crc kubenswrapper[4707]: I0129 03:49:03.714253 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5vqq2" podStartSLOduration=1.714226813 podStartE2EDuration="1.714226813s" podCreationTimestamp="2026-01-29 03:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:03.703651041 +0000 UTC m=+1297.187879936" watchObservedRunningTime="2026-01-29 03:49:03.714226813 +0000 UTC m=+1297.198455718" Jan 29 03:49:03 crc kubenswrapper[4707]: I0129 03:49:03.728917 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" podStartSLOduration=3.728890432 podStartE2EDuration="3.728890432s" podCreationTimestamp="2026-01-29 03:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:03.725411722 +0000 UTC m=+1297.209640627" watchObservedRunningTime="2026-01-29 03:49:03.728890432 +0000 UTC m=+1297.213119337" Jan 29 03:49:04 crc kubenswrapper[4707]: I0129 03:49:04.340567 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:04 crc kubenswrapper[4707]: I0129 03:49:04.360173 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 03:49:06 crc kubenswrapper[4707]: I0129 03:49:06.736798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f","Type":"ContainerStarted","Data":"b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc"} Jan 29 03:49:06 crc kubenswrapper[4707]: I0129 03:49:06.737811 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="bc59e1a6-5382-4b11-86e1-1d5bb3864a6f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc" gracePeriod=30 Jan 29 03:49:06 crc kubenswrapper[4707]: I0129 03:49:06.744344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d97051dd-853f-4295-a509-3bd719a6a907","Type":"ContainerStarted","Data":"68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c"} Jan 29 03:49:06 crc kubenswrapper[4707]: I0129 03:49:06.750447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"654bc6e4-18a2-4da6-a30a-9af36c8df0ec","Type":"ContainerStarted","Data":"31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c"} Jan 29 03:49:06 crc kubenswrapper[4707]: I0129 03:49:06.752443 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3df36f3e-f1ff-4a7a-94f5-251e977949cc","Type":"ContainerStarted","Data":"1e027fad81ca7dfa8a4f749e38b8f6b282f522cfd7b1bb5d8e70b57e221edd89"} Jan 29 03:49:06 crc kubenswrapper[4707]: I0129 03:49:06.767781 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.998198456 podStartE2EDuration="6.767757827s" podCreationTimestamp="2026-01-29 03:49:00 +0000 UTC" firstStartedPulling="2026-01-29 03:49:02.405256794 +0000 UTC m=+1295.889485699" lastFinishedPulling="2026-01-29 03:49:06.174816165 +0000 UTC m=+1299.659045070" observedRunningTime="2026-01-29 03:49:06.760801058 +0000 UTC m=+1300.245029973" 
watchObservedRunningTime="2026-01-29 03:49:06.767757827 +0000 UTC m=+1300.251986732" Jan 29 03:49:06 crc kubenswrapper[4707]: I0129 03:49:06.790620 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.3248197360000002 podStartE2EDuration="6.790594709s" podCreationTimestamp="2026-01-29 03:49:00 +0000 UTC" firstStartedPulling="2026-01-29 03:49:01.709032442 +0000 UTC m=+1295.193261347" lastFinishedPulling="2026-01-29 03:49:06.174807415 +0000 UTC m=+1299.659036320" observedRunningTime="2026-01-29 03:49:06.780178911 +0000 UTC m=+1300.264407826" watchObservedRunningTime="2026-01-29 03:49:06.790594709 +0000 UTC m=+1300.274823624" Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.472282 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.472637 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b5a35c97-c2c4-4513-b755-774a90aa56ff" containerName="kube-state-metrics" containerID="cri-o://c066bc913675ca0557b9bd95e6a7ca5b9313106c96308e38ec843795049a8230" gracePeriod=30 Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.772932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3df36f3e-f1ff-4a7a-94f5-251e977949cc","Type":"ContainerStarted","Data":"46cc6c097bb1ec5c99a016d57ff85959148ea67be60e7fb165d9b2d1a56d1bef"} Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.781240 4707 generic.go:334] "Generic (PLEG): container finished" podID="b5a35c97-c2c4-4513-b755-774a90aa56ff" containerID="c066bc913675ca0557b9bd95e6a7ca5b9313106c96308e38ec843795049a8230" exitCode=2 Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.781355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"b5a35c97-c2c4-4513-b755-774a90aa56ff","Type":"ContainerDied","Data":"c066bc913675ca0557b9bd95e6a7ca5b9313106c96308e38ec843795049a8230"} Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.795892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d97051dd-853f-4295-a509-3bd719a6a907","Type":"ContainerStarted","Data":"7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc"} Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.795995 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d97051dd-853f-4295-a509-3bd719a6a907" containerName="nova-metadata-log" containerID="cri-o://68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c" gracePeriod=30 Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.796351 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d97051dd-853f-4295-a509-3bd719a6a907" containerName="nova-metadata-metadata" containerID="cri-o://7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc" gracePeriod=30 Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.804343 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.941228839 podStartE2EDuration="7.804320653s" podCreationTimestamp="2026-01-29 03:49:00 +0000 UTC" firstStartedPulling="2026-01-29 03:49:02.316954131 +0000 UTC m=+1295.801183036" lastFinishedPulling="2026-01-29 03:49:06.180045945 +0000 UTC m=+1299.664274850" observedRunningTime="2026-01-29 03:49:07.802373497 +0000 UTC m=+1301.286602412" watchObservedRunningTime="2026-01-29 03:49:07.804320653 +0000 UTC m=+1301.288549558" Jan 29 03:49:07 crc kubenswrapper[4707]: I0129 03:49:07.863824 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.652675065 
podStartE2EDuration="7.863800492s" podCreationTimestamp="2026-01-29 03:49:00 +0000 UTC" firstStartedPulling="2026-01-29 03:49:01.963667878 +0000 UTC m=+1295.447896783" lastFinishedPulling="2026-01-29 03:49:06.174793295 +0000 UTC m=+1299.659022210" observedRunningTime="2026-01-29 03:49:07.837610944 +0000 UTC m=+1301.321839859" watchObservedRunningTime="2026-01-29 03:49:07.863800492 +0000 UTC m=+1301.348029397" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.132009 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.192937 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j5wk\" (UniqueName: \"kubernetes.io/projected/b5a35c97-c2c4-4513-b755-774a90aa56ff-kube-api-access-6j5wk\") pod \"b5a35c97-c2c4-4513-b755-774a90aa56ff\" (UID: \"b5a35c97-c2c4-4513-b755-774a90aa56ff\") " Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.216894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a35c97-c2c4-4513-b755-774a90aa56ff-kube-api-access-6j5wk" (OuterVolumeSpecName: "kube-api-access-6j5wk") pod "b5a35c97-c2c4-4513-b755-774a90aa56ff" (UID: "b5a35c97-c2c4-4513-b755-774a90aa56ff"). InnerVolumeSpecName "kube-api-access-6j5wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.304878 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j5wk\" (UniqueName: \"kubernetes.io/projected/b5a35c97-c2c4-4513-b755-774a90aa56ff-kube-api-access-6j5wk\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.653039 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.820509 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d97051dd-853f-4295-a509-3bd719a6a907-logs\") pod \"d97051dd-853f-4295-a509-3bd719a6a907\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.820817 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-config-data\") pod \"d97051dd-853f-4295-a509-3bd719a6a907\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.820918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82fkd\" (UniqueName: \"kubernetes.io/projected/d97051dd-853f-4295-a509-3bd719a6a907-kube-api-access-82fkd\") pod \"d97051dd-853f-4295-a509-3bd719a6a907\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.820989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-combined-ca-bundle\") pod \"d97051dd-853f-4295-a509-3bd719a6a907\" (UID: \"d97051dd-853f-4295-a509-3bd719a6a907\") " Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.825292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d97051dd-853f-4295-a509-3bd719a6a907-logs" (OuterVolumeSpecName: "logs") pod "d97051dd-853f-4295-a509-3bd719a6a907" (UID: "d97051dd-853f-4295-a509-3bd719a6a907"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.830519 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97051dd-853f-4295-a509-3bd719a6a907-kube-api-access-82fkd" (OuterVolumeSpecName: "kube-api-access-82fkd") pod "d97051dd-853f-4295-a509-3bd719a6a907" (UID: "d97051dd-853f-4295-a509-3bd719a6a907"). InnerVolumeSpecName "kube-api-access-82fkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.839106 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.842026 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b5a35c97-c2c4-4513-b755-774a90aa56ff","Type":"ContainerDied","Data":"e2abadeef61bad37e791079ca4115eef9595733a6078cac8ff6a87cf21ad4575"} Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.842114 4707 scope.go:117] "RemoveContainer" containerID="c066bc913675ca0557b9bd95e6a7ca5b9313106c96308e38ec843795049a8230" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.861680 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-config-data" (OuterVolumeSpecName: "config-data") pod "d97051dd-853f-4295-a509-3bd719a6a907" (UID: "d97051dd-853f-4295-a509-3bd719a6a907"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.864292 4707 generic.go:334] "Generic (PLEG): container finished" podID="d97051dd-853f-4295-a509-3bd719a6a907" containerID="7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc" exitCode=0 Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.864335 4707 generic.go:334] "Generic (PLEG): container finished" podID="d97051dd-853f-4295-a509-3bd719a6a907" containerID="68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c" exitCode=143 Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.864712 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.865203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d97051dd-853f-4295-a509-3bd719a6a907","Type":"ContainerDied","Data":"7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc"} Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.865249 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d97051dd-853f-4295-a509-3bd719a6a907","Type":"ContainerDied","Data":"68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c"} Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.865263 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d97051dd-853f-4295-a509-3bd719a6a907","Type":"ContainerDied","Data":"c437c2b305f70b2fc36be5361508f782ed6ec8eca7b694372918d118cf35044b"} Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.878822 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d97051dd-853f-4295-a509-3bd719a6a907" (UID: "d97051dd-853f-4295-a509-3bd719a6a907"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.923229 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d97051dd-853f-4295-a509-3bd719a6a907-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.923263 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.923277 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82fkd\" (UniqueName: \"kubernetes.io/projected/d97051dd-853f-4295-a509-3bd719a6a907-kube-api-access-82fkd\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.923289 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97051dd-853f-4295-a509-3bd719a6a907-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.962355 4707 scope.go:117] "RemoveContainer" containerID="7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc" Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.979914 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 03:49:08 crc kubenswrapper[4707]: I0129 03:49:08.987122 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.000065 4707 scope.go:117] "RemoveContainer" containerID="68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.007429 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 03:49:09 crc 
kubenswrapper[4707]: E0129 03:49:09.013168 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a35c97-c2c4-4513-b755-774a90aa56ff" containerName="kube-state-metrics" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.013210 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a35c97-c2c4-4513-b755-774a90aa56ff" containerName="kube-state-metrics" Jan 29 03:49:09 crc kubenswrapper[4707]: E0129 03:49:09.013280 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97051dd-853f-4295-a509-3bd719a6a907" containerName="nova-metadata-log" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.013292 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97051dd-853f-4295-a509-3bd719a6a907" containerName="nova-metadata-log" Jan 29 03:49:09 crc kubenswrapper[4707]: E0129 03:49:09.013309 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97051dd-853f-4295-a509-3bd719a6a907" containerName="nova-metadata-metadata" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.013318 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97051dd-853f-4295-a509-3bd719a6a907" containerName="nova-metadata-metadata" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.014495 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d97051dd-853f-4295-a509-3bd719a6a907" containerName="nova-metadata-metadata" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.014532 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d97051dd-853f-4295-a509-3bd719a6a907" containerName="nova-metadata-log" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.014561 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a35c97-c2c4-4513-b755-774a90aa56ff" containerName="kube-state-metrics" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.015614 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.020103 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.024615 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.029146 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.042875 4707 scope.go:117] "RemoveContainer" containerID="7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc" Jan 29 03:49:09 crc kubenswrapper[4707]: E0129 03:49:09.045774 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc\": container with ID starting with 7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc not found: ID does not exist" containerID="7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.046100 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc"} err="failed to get container status \"7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc\": rpc error: code = NotFound desc = could not find container \"7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc\": container with ID starting with 7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc not found: ID does not exist" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.046132 4707 scope.go:117] "RemoveContainer" containerID="68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c" Jan 29 03:49:09 crc 
kubenswrapper[4707]: E0129 03:49:09.046548 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c\": container with ID starting with 68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c not found: ID does not exist" containerID="68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.046576 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c"} err="failed to get container status \"68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c\": rpc error: code = NotFound desc = could not find container \"68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c\": container with ID starting with 68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c not found: ID does not exist" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.046589 4707 scope.go:117] "RemoveContainer" containerID="7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.046918 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc"} err="failed to get container status \"7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc\": rpc error: code = NotFound desc = could not find container \"7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc\": container with ID starting with 7488d89ff1353c42bf8cdc738a61545ff89ea987146cf8050aa52883adaae0bc not found: ID does not exist" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.046943 4707 scope.go:117] "RemoveContainer" containerID="68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c" Jan 29 
03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.047783 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c"} err="failed to get container status \"68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c\": rpc error: code = NotFound desc = could not find container \"68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c\": container with ID starting with 68759ff60681f5c13950735dca6e402b685a42737a851d56998687828d65407c not found: ID does not exist" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.128637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6f586474-1963-4702-81bf-36d31bf0a3ae-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.128780 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f586474-1963-4702-81bf-36d31bf0a3ae-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.128824 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f586474-1963-4702-81bf-36d31bf0a3ae-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.128868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ml9pv\" (UniqueName: \"kubernetes.io/projected/6f586474-1963-4702-81bf-36d31bf0a3ae-kube-api-access-ml9pv\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.214676 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.230825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f586474-1963-4702-81bf-36d31bf0a3ae-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.231690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f586474-1963-4702-81bf-36d31bf0a3ae-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.231772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml9pv\" (UniqueName: \"kubernetes.io/projected/6f586474-1963-4702-81bf-36d31bf0a3ae-kube-api-access-ml9pv\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.231864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6f586474-1963-4702-81bf-36d31bf0a3ae-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: 
I0129 03:49:09.232652 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.236136 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f586474-1963-4702-81bf-36d31bf0a3ae-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.237079 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f586474-1963-4702-81bf-36d31bf0a3ae-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.248168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6f586474-1963-4702-81bf-36d31bf0a3ae-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.254802 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml9pv\" (UniqueName: \"kubernetes.io/projected/6f586474-1963-4702-81bf-36d31bf0a3ae-kube-api-access-ml9pv\") pod \"kube-state-metrics-0\" (UID: \"6f586474-1963-4702-81bf-36d31bf0a3ae\") " pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.273118 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a35c97-c2c4-4513-b755-774a90aa56ff" path="/var/lib/kubelet/pods/b5a35c97-c2c4-4513-b755-774a90aa56ff/volumes" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.274048 4707 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="d97051dd-853f-4295-a509-3bd719a6a907" path="/var/lib/kubelet/pods/d97051dd-853f-4295-a509-3bd719a6a907/volumes" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.275320 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.278629 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.278950 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.283454 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.289163 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.339595 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.438181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92e3a65-8435-42f4-bd47-2c0b79439293-logs\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.438252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdb8m\" (UniqueName: \"kubernetes.io/projected/e92e3a65-8435-42f4-bd47-2c0b79439293-kube-api-access-pdb8m\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.438285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.438315 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-config-data\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.438460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc 
kubenswrapper[4707]: I0129 03:49:09.540978 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92e3a65-8435-42f4-bd47-2c0b79439293-logs\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.543506 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdb8m\" (UniqueName: \"kubernetes.io/projected/e92e3a65-8435-42f4-bd47-2c0b79439293-kube-api-access-pdb8m\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.543568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.543619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-config-data\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.543969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.541695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e92e3a65-8435-42f4-bd47-2c0b79439293-logs\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.552425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-config-data\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.557287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.563091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdb8m\" (UniqueName: \"kubernetes.io/projected/e92e3a65-8435-42f4-bd47-2c0b79439293-kube-api-access-pdb8m\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.601638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.644092 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:09 crc kubenswrapper[4707]: I0129 03:49:09.891309 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 03:49:09 crc kubenswrapper[4707]: W0129 03:49:09.893930 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f586474_1963_4702_81bf_36d31bf0a3ae.slice/crio-c7543ecb7fd0c66c26d0fded5db84650e8fec608a5daca4e5901c59eb64af1f9 WatchSource:0}: Error finding container c7543ecb7fd0c66c26d0fded5db84650e8fec608a5daca4e5901c59eb64af1f9: Status 404 returned error can't find the container with id c7543ecb7fd0c66c26d0fded5db84650e8fec608a5daca4e5901c59eb64af1f9 Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.125073 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.125424 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="ceilometer-central-agent" containerID="cri-o://9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113" gracePeriod=30 Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.125720 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="proxy-httpd" containerID="cri-o://4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d" gracePeriod=30 Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.125795 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="sg-core" containerID="cri-o://cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9" gracePeriod=30 Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 
03:49:10.125839 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="ceilometer-notification-agent" containerID="cri-o://3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef" gracePeriod=30 Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.177889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:10 crc kubenswrapper[4707]: W0129 03:49:10.183578 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode92e3a65_8435_42f4_bd47_2c0b79439293.slice/crio-823aa1334b73bba19a453c46458739a7d50363763dcaccbb73d9c36e24cb92d9 WatchSource:0}: Error finding container 823aa1334b73bba19a453c46458739a7d50363763dcaccbb73d9c36e24cb92d9: Status 404 returned error can't find the container with id 823aa1334b73bba19a453c46458739a7d50363763dcaccbb73d9c36e24cb92d9 Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.897790 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6f586474-1963-4702-81bf-36d31bf0a3ae","Type":"ContainerStarted","Data":"07b45912b6b3c7411cf05493174dfa2bf93949fc5f00fecee23e54992c7ec767"} Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.898263 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.898277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6f586474-1963-4702-81bf-36d31bf0a3ae","Type":"ContainerStarted","Data":"c7543ecb7fd0c66c26d0fded5db84650e8fec608a5daca4e5901c59eb64af1f9"} Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.901028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e92e3a65-8435-42f4-bd47-2c0b79439293","Type":"ContainerStarted","Data":"b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444"} Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.901062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e92e3a65-8435-42f4-bd47-2c0b79439293","Type":"ContainerStarted","Data":"14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff"} Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.901075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e92e3a65-8435-42f4-bd47-2c0b79439293","Type":"ContainerStarted","Data":"823aa1334b73bba19a453c46458739a7d50363763dcaccbb73d9c36e24cb92d9"} Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.904468 4707 generic.go:334] "Generic (PLEG): container finished" podID="c86d5142-df8a-443d-9cd0-127c672072b8" containerID="4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d" exitCode=0 Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.904528 4707 generic.go:334] "Generic (PLEG): container finished" podID="c86d5142-df8a-443d-9cd0-127c672072b8" containerID="cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9" exitCode=2 Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.904550 4707 generic.go:334] "Generic (PLEG): container finished" podID="c86d5142-df8a-443d-9cd0-127c672072b8" containerID="9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113" exitCode=0 Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.904580 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerDied","Data":"4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d"} Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.904756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerDied","Data":"cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9"} Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.904774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerDied","Data":"9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113"} Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.915836 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.916161 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.939552 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.553574493 podStartE2EDuration="2.9395108s" podCreationTimestamp="2026-01-29 03:49:08 +0000 UTC" firstStartedPulling="2026-01-29 03:49:09.898378463 +0000 UTC m=+1303.382607368" lastFinishedPulling="2026-01-29 03:49:10.28431477 +0000 UTC m=+1303.768543675" observedRunningTime="2026-01-29 03:49:10.919007603 +0000 UTC m=+1304.403236508" watchObservedRunningTime="2026-01-29 03:49:10.9395108 +0000 UTC m=+1304.423739705" Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 03:49:10.951416 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.951389119 podStartE2EDuration="1.951389119s" podCreationTimestamp="2026-01-29 03:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:10.943230206 +0000 UTC m=+1304.427459111" watchObservedRunningTime="2026-01-29 03:49:10.951389119 +0000 UTC m=+1304.435618024" Jan 29 03:49:10 crc kubenswrapper[4707]: I0129 
03:49:10.975577 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.203762 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.280440 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-5hptj"] Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.280794 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" podUID="b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" containerName="dnsmasq-dns" containerID="cri-o://a61a8c84b6b80ff80b6088aedac76a006021d4d1797b4c1f84fa01f85dfc71d4" gracePeriod=10 Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.327555 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.327611 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.346885 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.917485 4707 generic.go:334] "Generic (PLEG): container finished" podID="b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" containerID="a61a8c84b6b80ff80b6088aedac76a006021d4d1797b4c1f84fa01f85dfc71d4" exitCode=0 Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.917574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" event={"ID":"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f","Type":"ContainerDied","Data":"a61a8c84b6b80ff80b6088aedac76a006021d4d1797b4c1f84fa01f85dfc71d4"} Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.917634 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj" event={"ID":"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f","Type":"ContainerDied","Data":"54532c5691f37a5cce1b5b2e5e0c661153e9047be70efb12c9e1dd3d47ffccd5"} Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.917653 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54532c5691f37a5cce1b5b2e5e0c661153e9047be70efb12c9e1dd3d47ffccd5" Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.919601 4707 generic.go:334] "Generic (PLEG): container finished" podID="83e0e811-830f-4e5c-a3bb-0dd5be01cf3d" containerID="89cdc2f080c47e24de0dc68ad9f0a1af58af820c56016721a5bde4bfcabf3422" exitCode=0 Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.919730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vm7jk" event={"ID":"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d","Type":"ContainerDied","Data":"89cdc2f080c47e24de0dc68ad9f0a1af58af820c56016721a5bde4bfcabf3422"} Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.956899 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 03:49:11 crc kubenswrapper[4707]: I0129 03:49:11.958529 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj"
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.015386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-swift-storage-0\") pod \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") "
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.015433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf6k2\" (UniqueName: \"kubernetes.io/projected/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-kube-api-access-zf6k2\") pod \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") "
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.015593 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-nb\") pod \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") "
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.015680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-config\") pod \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") "
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.015719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-svc\") pod \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") "
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.015760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-sb\") pod \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\" (UID: \"b3a7e44d-1a9d-432f-a68b-d4cd671fef9f\") "
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.058975 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-kube-api-access-zf6k2" (OuterVolumeSpecName: "kube-api-access-zf6k2") pod "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" (UID: "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f"). InnerVolumeSpecName "kube-api-access-zf6k2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.100727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" (UID: "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.103059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" (UID: "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.118245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-config" (OuterVolumeSpecName: "config") pod "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" (UID: "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.129423 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf6k2\" (UniqueName: \"kubernetes.io/projected/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-kube-api-access-zf6k2\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.129471 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-config\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.129483 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.129494 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.135967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" (UID: "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.138474 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" (UID: "b3a7e44d-1a9d-432f-a68b-d4cd671fef9f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.232951 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.232996 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.410752 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.410757 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.929969 4707 generic.go:334] "Generic (PLEG): container finished" podID="ededf3cb-8e51-4ba6-a3d1-dc12980e13d1" containerID="9f316524c65dbf31e260d11849fb730356d7a03a052a854227521385e116f055" exitCode=0
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.930051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vqq2" event={"ID":"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1","Type":"ContainerDied","Data":"9f316524c65dbf31e260d11849fb730356d7a03a052a854227521385e116f055"}
Jan 29 03:49:12 crc kubenswrapper[4707]: I0129 03:49:12.930661 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-5hptj"
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.001317 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-5hptj"]
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.012764 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-5hptj"]
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.256127 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" path="/var/lib/kubelet/pods/b3a7e44d-1a9d-432f-a68b-d4cd671fef9f/volumes"
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.396182 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vm7jk"
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.485879 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-config-data\") pod \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") "
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.485986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnmtt\" (UniqueName: \"kubernetes.io/projected/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-kube-api-access-nnmtt\") pod \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") "
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.486258 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-combined-ca-bundle\") pod \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") "
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.486361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-scripts\") pod \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\" (UID: \"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d\") "
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.496893 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-scripts" (OuterVolumeSpecName: "scripts") pod "83e0e811-830f-4e5c-a3bb-0dd5be01cf3d" (UID: "83e0e811-830f-4e5c-a3bb-0dd5be01cf3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.529986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-kube-api-access-nnmtt" (OuterVolumeSpecName: "kube-api-access-nnmtt") pod "83e0e811-830f-4e5c-a3bb-0dd5be01cf3d" (UID: "83e0e811-830f-4e5c-a3bb-0dd5be01cf3d"). InnerVolumeSpecName "kube-api-access-nnmtt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.540643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83e0e811-830f-4e5c-a3bb-0dd5be01cf3d" (UID: "83e0e811-830f-4e5c-a3bb-0dd5be01cf3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.559576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-config-data" (OuterVolumeSpecName: "config-data") pod "83e0e811-830f-4e5c-a3bb-0dd5be01cf3d" (UID: "83e0e811-830f-4e5c-a3bb-0dd5be01cf3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.590002 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.590044 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnmtt\" (UniqueName: \"kubernetes.io/projected/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-kube-api-access-nnmtt\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.590060 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.590069 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.899116 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.993481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vm7jk" event={"ID":"83e0e811-830f-4e5c-a3bb-0dd5be01cf3d","Type":"ContainerDied","Data":"3f632aac2a75749156b81cbdce0c3118a791f4d4c17523382cc746a168f592a1"}
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.993868 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f632aac2a75749156b81cbdce0c3118a791f4d4c17523382cc746a168f592a1"
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.993808 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vm7jk"
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.998700 4707 generic.go:334] "Generic (PLEG): container finished" podID="c86d5142-df8a-443d-9cd0-127c672072b8" containerID="3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef" exitCode=0
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.998810 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.998831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerDied","Data":"3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef"}
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.998905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c86d5142-df8a-443d-9cd0-127c672072b8","Type":"ContainerDied","Data":"b647ccc2e1f1e9e36dd01c7142cfe618c90741a1e0421d1157d28f1a0797c68a"}
Jan 29 03:49:13 crc kubenswrapper[4707]: I0129 03:49:13.998933 4707 scope.go:117] "RemoveContainer" containerID="4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.001511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-config-data\") pod \"c86d5142-df8a-443d-9cd0-127c672072b8\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.001604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-sg-core-conf-yaml\") pod \"c86d5142-df8a-443d-9cd0-127c672072b8\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.001653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-466tz\" (UniqueName: \"kubernetes.io/projected/c86d5142-df8a-443d-9cd0-127c672072b8-kube-api-access-466tz\") pod \"c86d5142-df8a-443d-9cd0-127c672072b8\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.001678 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-scripts\") pod \"c86d5142-df8a-443d-9cd0-127c672072b8\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.001712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-log-httpd\") pod \"c86d5142-df8a-443d-9cd0-127c672072b8\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.001729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-combined-ca-bundle\") pod \"c86d5142-df8a-443d-9cd0-127c672072b8\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.001918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-run-httpd\") pod \"c86d5142-df8a-443d-9cd0-127c672072b8\" (UID: \"c86d5142-df8a-443d-9cd0-127c672072b8\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.003656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c86d5142-df8a-443d-9cd0-127c672072b8" (UID: "c86d5142-df8a-443d-9cd0-127c672072b8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.004012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c86d5142-df8a-443d-9cd0-127c672072b8" (UID: "c86d5142-df8a-443d-9cd0-127c672072b8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.008043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-scripts" (OuterVolumeSpecName: "scripts") pod "c86d5142-df8a-443d-9cd0-127c672072b8" (UID: "c86d5142-df8a-443d-9cd0-127c672072b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.012754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86d5142-df8a-443d-9cd0-127c672072b8-kube-api-access-466tz" (OuterVolumeSpecName: "kube-api-access-466tz") pod "c86d5142-df8a-443d-9cd0-127c672072b8" (UID: "c86d5142-df8a-443d-9cd0-127c672072b8"). InnerVolumeSpecName "kube-api-access-466tz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.045456 4707 scope.go:117] "RemoveContainer" containerID="cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.059139 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c86d5142-df8a-443d-9cd0-127c672072b8" (UID: "c86d5142-df8a-443d-9cd0-127c672072b8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.073710 4707 scope.go:117] "RemoveContainer" containerID="3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.098152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c86d5142-df8a-443d-9cd0-127c672072b8" (UID: "c86d5142-df8a-443d-9cd0-127c672072b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.104841 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.104876 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-466tz\" (UniqueName: \"kubernetes.io/projected/c86d5142-df8a-443d-9cd0-127c672072b8-kube-api-access-466tz\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.104887 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.104897 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.104907 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.104916 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c86d5142-df8a-443d-9cd0-127c672072b8-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.120965 4707 scope.go:117] "RemoveContainer" containerID="9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.139033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-config-data" (OuterVolumeSpecName: "config-data") pod "c86d5142-df8a-443d-9cd0-127c672072b8" (UID: "c86d5142-df8a-443d-9cd0-127c672072b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.148769 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.152866 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerName="nova-api-log" containerID="cri-o://1e027fad81ca7dfa8a4f749e38b8f6b282f522cfd7b1bb5d8e70b57e221edd89" gracePeriod=30
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.153039 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerName="nova-api-api" containerID="cri-o://46cc6c097bb1ec5c99a016d57ff85959148ea67be60e7fb165d9b2d1a56d1bef" gracePeriod=30
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.154340 4707 scope.go:117] "RemoveContainer" containerID="4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.163762 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d\": container with ID starting with 4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d not found: ID does not exist" containerID="4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.163838 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d"} err="failed to get container status \"4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d\": rpc error: code = NotFound desc = could not find container \"4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d\": container with ID starting with 4148b38d5abf1720d831af76cbb06da1e939736b07344622d2875704cb221c8d not found: ID does not exist"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.163890 4707 scope.go:117] "RemoveContainer" containerID="cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.165742 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9\": container with ID starting with cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9 not found: ID does not exist" containerID="cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.165817 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9"} err="failed to get container status \"cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9\": rpc error: code = NotFound desc = could not find container \"cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9\": container with ID starting with cb14ad5fac5583110928bfe76096b21237685a07b83116e1c75a5e5f7ea461a9 not found: ID does not exist"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.165870 4707 scope.go:117] "RemoveContainer" containerID="3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.166249 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef\": container with ID starting with 3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef not found: ID does not exist" containerID="3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.166297 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef"} err="failed to get container status \"3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef\": rpc error: code = NotFound desc = could not find container \"3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef\": container with ID starting with 3416c1bf7e20a4b36a191807d06594a2bc25e539a69847269aa8e46a76c3a0ef not found: ID does not exist"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.166329 4707 scope.go:117] "RemoveContainer" containerID="9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.166612 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113\": container with ID starting with 9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113 not found: ID does not exist" containerID="9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.166646 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113"} err="failed to get container status \"9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113\": rpc error: code = NotFound desc = could not find container \"9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113\": container with ID starting with 9ed2ed89bba06a60ad40238aeed1d33a664c278219bc2981f9f9acfa52e2f113 not found: ID does not exist"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.176558 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.201572 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.201957 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerName="nova-metadata-log" containerID="cri-o://14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff" gracePeriod=30
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.202128 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerName="nova-metadata-metadata" containerID="cri-o://b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444" gracePeriod=30
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.208711 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c86d5142-df8a-443d-9cd0-127c672072b8-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.571920 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vqq2"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.584183 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.604634 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.626208 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.626769 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" containerName="dnsmasq-dns"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.626784 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" containerName="dnsmasq-dns"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.626797 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e0e811-830f-4e5c-a3bb-0dd5be01cf3d" containerName="nova-manage"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.626805 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e0e811-830f-4e5c-a3bb-0dd5be01cf3d" containerName="nova-manage"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.626821 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" containerName="init"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.626828 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" containerName="init"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.626837 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="sg-core"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.626843 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="sg-core"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.626855 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="ceilometer-central-agent"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.626861 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="ceilometer-central-agent"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.626888 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ededf3cb-8e51-4ba6-a3d1-dc12980e13d1" containerName="nova-cell1-conductor-db-sync"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.626895 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ededf3cb-8e51-4ba6-a3d1-dc12980e13d1" containerName="nova-cell1-conductor-db-sync"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.626912 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="ceilometer-notification-agent"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.626921 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="ceilometer-notification-agent"
Jan 29 03:49:14 crc kubenswrapper[4707]: E0129 03:49:14.626934 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="proxy-httpd"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.626940 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="proxy-httpd"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.627187 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="proxy-httpd"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.627209 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="ceilometer-central-agent"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.627219 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a7e44d-1a9d-432f-a68b-d4cd671fef9f" containerName="dnsmasq-dns"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.627232 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="sg-core"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.627247 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" containerName="ceilometer-notification-agent"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.627261 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ededf3cb-8e51-4ba6-a3d1-dc12980e13d1" containerName="nova-cell1-conductor-db-sync"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.627269 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e0e811-830f-4e5c-a3bb-0dd5be01cf3d" containerName="nova-manage"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.629405 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.635028 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.635270 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.635452 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.645625 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.645692 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.662347 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.720626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-config-data\") pod \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.720787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcmdm\" (UniqueName: \"kubernetes.io/projected/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-kube-api-access-vcmdm\") pod \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.720896 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-scripts\") pod \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.725331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-combined-ca-bundle\") pod \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\" (UID: \"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1\") "
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.727000 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-config-data\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.727090 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx28c\" (UniqueName: \"kubernetes.io/projected/f9871425-141a-4770-9a04-117e63870be7-kube-api-access-rx28c\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.727179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-scripts\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.727222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.727331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.727410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.727512 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-log-httpd\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.733117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-run-httpd\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0"
Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.740730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-scripts" (OuterVolumeSpecName: "scripts") pod "ededf3cb-8e51-4ba6-a3d1-dc12980e13d1" (UID: "ededf3cb-8e51-4ba6-a3d1-dc12980e13d1"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.748879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-kube-api-access-vcmdm" (OuterVolumeSpecName: "kube-api-access-vcmdm") pod "ededf3cb-8e51-4ba6-a3d1-dc12980e13d1" (UID: "ededf3cb-8e51-4ba6-a3d1-dc12980e13d1"). InnerVolumeSpecName "kube-api-access-vcmdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.760649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-config-data" (OuterVolumeSpecName: "config-data") pod "ededf3cb-8e51-4ba6-a3d1-dc12980e13d1" (UID: "ededf3cb-8e51-4ba6-a3d1-dc12980e13d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.778774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ededf3cb-8e51-4ba6-a3d1-dc12980e13d1" (UID: "ededf3cb-8e51-4ba6-a3d1-dc12980e13d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.794653 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.842239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-run-httpd\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.842515 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-config-data\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.842970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-run-httpd\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.843242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx28c\" (UniqueName: \"kubernetes.io/projected/f9871425-141a-4770-9a04-117e63870be7-kube-api-access-rx28c\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.843306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-scripts\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.843327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.843384 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.843406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.843457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-log-httpd\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.846236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-log-httpd\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.848941 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcmdm\" (UniqueName: \"kubernetes.io/projected/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-kube-api-access-vcmdm\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.848970 4707 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.848980 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.848990 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.851170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-config-data\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.851374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.853516 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.854041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-scripts\") pod \"ceilometer-0\" (UID: 
\"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.855886 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.861251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx28c\" (UniqueName: \"kubernetes.io/projected/f9871425-141a-4770-9a04-117e63870be7-kube-api-access-rx28c\") pod \"ceilometer-0\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.950250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92e3a65-8435-42f4-bd47-2c0b79439293-logs\") pod \"e92e3a65-8435-42f4-bd47-2c0b79439293\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.950344 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-combined-ca-bundle\") pod \"e92e3a65-8435-42f4-bd47-2c0b79439293\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.950485 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdb8m\" (UniqueName: \"kubernetes.io/projected/e92e3a65-8435-42f4-bd47-2c0b79439293-kube-api-access-pdb8m\") pod \"e92e3a65-8435-42f4-bd47-2c0b79439293\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.950512 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-config-data\") pod \"e92e3a65-8435-42f4-bd47-2c0b79439293\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.950652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e92e3a65-8435-42f4-bd47-2c0b79439293-logs" (OuterVolumeSpecName: "logs") pod "e92e3a65-8435-42f4-bd47-2c0b79439293" (UID: "e92e3a65-8435-42f4-bd47-2c0b79439293"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.951164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-nova-metadata-tls-certs\") pod \"e92e3a65-8435-42f4-bd47-2c0b79439293\" (UID: \"e92e3a65-8435-42f4-bd47-2c0b79439293\") " Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.951651 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92e3a65-8435-42f4-bd47-2c0b79439293-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.955043 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92e3a65-8435-42f4-bd47-2c0b79439293-kube-api-access-pdb8m" (OuterVolumeSpecName: "kube-api-access-pdb8m") pod "e92e3a65-8435-42f4-bd47-2c0b79439293" (UID: "e92e3a65-8435-42f4-bd47-2c0b79439293"). InnerVolumeSpecName "kube-api-access-pdb8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.967614 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:49:14 crc kubenswrapper[4707]: I0129 03:49:14.979715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-config-data" (OuterVolumeSpecName: "config-data") pod "e92e3a65-8435-42f4-bd47-2c0b79439293" (UID: "e92e3a65-8435-42f4-bd47-2c0b79439293"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.000959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e92e3a65-8435-42f4-bd47-2c0b79439293" (UID: "e92e3a65-8435-42f4-bd47-2c0b79439293"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.018944 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5vqq2" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.019238 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5vqq2" event={"ID":"ededf3cb-8e51-4ba6-a3d1-dc12980e13d1","Type":"ContainerDied","Data":"06b8db512e65f764adedd7be3a2c5e1351e86f7ef58e696a6fc88e7d139479d6"} Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.019350 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b8db512e65f764adedd7be3a2c5e1351e86f7ef58e696a6fc88e7d139479d6" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.024703 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e92e3a65-8435-42f4-bd47-2c0b79439293" (UID: "e92e3a65-8435-42f4-bd47-2c0b79439293"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.042119 4707 generic.go:334] "Generic (PLEG): container finished" podID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerID="1e027fad81ca7dfa8a4f749e38b8f6b282f522cfd7b1bb5d8e70b57e221edd89" exitCode=143 Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.042334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3df36f3e-f1ff-4a7a-94f5-251e977949cc","Type":"ContainerDied","Data":"1e027fad81ca7dfa8a4f749e38b8f6b282f522cfd7b1bb5d8e70b57e221edd89"} Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.049246 4707 generic.go:334] "Generic (PLEG): container finished" podID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerID="b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444" exitCode=0 Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.049289 4707 generic.go:334] "Generic (PLEG): container finished" podID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerID="14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff" exitCode=143 Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.049481 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="654bc6e4-18a2-4da6-a30a-9af36c8df0ec" containerName="nova-scheduler-scheduler" containerID="cri-o://31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c" gracePeriod=30 Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.049971 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.052634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e92e3a65-8435-42f4-bd47-2c0b79439293","Type":"ContainerDied","Data":"b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444"} Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.052691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e92e3a65-8435-42f4-bd47-2c0b79439293","Type":"ContainerDied","Data":"14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff"} Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.052707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e92e3a65-8435-42f4-bd47-2c0b79439293","Type":"ContainerDied","Data":"823aa1334b73bba19a453c46458739a7d50363763dcaccbb73d9c36e24cb92d9"} Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.052724 4707 scope.go:117] "RemoveContainer" containerID="b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.053941 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.053958 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdb8m\" (UniqueName: \"kubernetes.io/projected/e92e3a65-8435-42f4-bd47-2c0b79439293-kube-api-access-pdb8m\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.053969 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 
03:49:15.053980 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e92e3a65-8435-42f4-bd47-2c0b79439293-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.069847 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 03:49:15 crc kubenswrapper[4707]: E0129 03:49:15.070557 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerName="nova-metadata-metadata" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.070576 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerName="nova-metadata-metadata" Jan 29 03:49:15 crc kubenswrapper[4707]: E0129 03:49:15.070697 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerName="nova-metadata-log" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.070708 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerName="nova-metadata-log" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.071030 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerName="nova-metadata-log" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.071063 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92e3a65-8435-42f4-bd47-2c0b79439293" containerName="nova-metadata-metadata" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.072098 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.076999 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.114831 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.140577 4707 scope.go:117] "RemoveContainer" containerID="14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.156762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47acef2f-13c3-47cd-b61f-a65e20f570a4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"47acef2f-13c3-47cd-b61f-a65e20f570a4\") " pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.156925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47acef2f-13c3-47cd-b61f-a65e20f570a4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"47acef2f-13c3-47cd-b61f-a65e20f570a4\") " pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.156973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhz9t\" (UniqueName: \"kubernetes.io/projected/47acef2f-13c3-47cd-b61f-a65e20f570a4-kube-api-access-xhz9t\") pod \"nova-cell1-conductor-0\" (UID: \"47acef2f-13c3-47cd-b61f-a65e20f570a4\") " pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.161101 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.181439 4707 
scope.go:117] "RemoveContainer" containerID="b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444" Jan 29 03:49:15 crc kubenswrapper[4707]: E0129 03:49:15.183078 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444\": container with ID starting with b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444 not found: ID does not exist" containerID="b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.183115 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444"} err="failed to get container status \"b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444\": rpc error: code = NotFound desc = could not find container \"b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444\": container with ID starting with b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444 not found: ID does not exist" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.183144 4707 scope.go:117] "RemoveContainer" containerID="14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff" Jan 29 03:49:15 crc kubenswrapper[4707]: E0129 03:49:15.183736 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff\": container with ID starting with 14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff not found: ID does not exist" containerID="14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.183757 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff"} err="failed to get container status \"14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff\": rpc error: code = NotFound desc = could not find container \"14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff\": container with ID starting with 14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff not found: ID does not exist" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.183773 4707 scope.go:117] "RemoveContainer" containerID="b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.187099 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444"} err="failed to get container status \"b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444\": rpc error: code = NotFound desc = could not find container \"b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444\": container with ID starting with b772621a288a1c820ecc47cf5a5716779a8b81e73979992dead810d797778444 not found: ID does not exist" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.187121 4707 scope.go:117] "RemoveContainer" containerID="14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.187195 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.187372 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff"} err="failed to get container status \"14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff\": rpc error: code = NotFound desc = could not find container 
\"14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff\": container with ID starting with 14f4ce3acb66ae5a7aa0e6083dc93a29136b5121dc128087dfb7b0914a13c8ff not found: ID does not exist" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.198938 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.200984 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.203643 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.204344 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.204899 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.258028 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86d5142-df8a-443d-9cd0-127c672072b8" path="/var/lib/kubelet/pods/c86d5142-df8a-443d-9cd0-127c672072b8/volumes" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.258848 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e92e3a65-8435-42f4-bd47-2c0b79439293" path="/var/lib/kubelet/pods/e92e3a65-8435-42f4-bd47-2c0b79439293/volumes" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.276709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47acef2f-13c3-47cd-b61f-a65e20f570a4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"47acef2f-13c3-47cd-b61f-a65e20f570a4\") " pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.277056 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47acef2f-13c3-47cd-b61f-a65e20f570a4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"47acef2f-13c3-47cd-b61f-a65e20f570a4\") " pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.277163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhz9t\" (UniqueName: \"kubernetes.io/projected/47acef2f-13c3-47cd-b61f-a65e20f570a4-kube-api-access-xhz9t\") pod \"nova-cell1-conductor-0\" (UID: \"47acef2f-13c3-47cd-b61f-a65e20f570a4\") " pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.285632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47acef2f-13c3-47cd-b61f-a65e20f570a4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"47acef2f-13c3-47cd-b61f-a65e20f570a4\") " pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.296983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhz9t\" (UniqueName: \"kubernetes.io/projected/47acef2f-13c3-47cd-b61f-a65e20f570a4-kube-api-access-xhz9t\") pod \"nova-cell1-conductor-0\" (UID: \"47acef2f-13c3-47cd-b61f-a65e20f570a4\") " pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.303636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47acef2f-13c3-47cd-b61f-a65e20f570a4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"47acef2f-13c3-47cd-b61f-a65e20f570a4\") " pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.381051 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-config-data\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.381133 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.381166 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkzj\" (UniqueName: \"kubernetes.io/projected/1d07ad17-c85d-4890-9636-b02564c1c482-kube-api-access-skkzj\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.381192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.381312 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d07ad17-c85d-4890-9636-b02564c1c482-logs\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.394521 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.482920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-config-data\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.482979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.483001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skkzj\" (UniqueName: \"kubernetes.io/projected/1d07ad17-c85d-4890-9636-b02564c1c482-kube-api-access-skkzj\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.483022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.483112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d07ad17-c85d-4890-9636-b02564c1c482-logs\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.483591 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d07ad17-c85d-4890-9636-b02564c1c482-logs\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.492775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.497300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-config-data\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.505065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.510395 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkzj\" (UniqueName: \"kubernetes.io/projected/1d07ad17-c85d-4890-9636-b02564c1c482-kube-api-access-skkzj\") pod \"nova-metadata-0\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.527903 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.549479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:49:15 crc kubenswrapper[4707]: W0129 03:49:15.874822 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47acef2f_13c3_47cd_b61f_a65e20f570a4.slice/crio-e13e8665bd7764ab643d6a43b1d75cfcae8fb39225ee1c09cf61326a9c572b2d WatchSource:0}: Error finding container e13e8665bd7764ab643d6a43b1d75cfcae8fb39225ee1c09cf61326a9c572b2d: Status 404 returned error can't find the container with id e13e8665bd7764ab643d6a43b1d75cfcae8fb39225ee1c09cf61326a9c572b2d Jan 29 03:49:15 crc kubenswrapper[4707]: I0129 03:49:15.875098 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 03:49:15 crc kubenswrapper[4707]: E0129 03:49:15.931788 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 03:49:15 crc kubenswrapper[4707]: E0129 03:49:15.934107 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 03:49:15 crc kubenswrapper[4707]: E0129 03:49:15.936317 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 03:49:15 crc kubenswrapper[4707]: E0129 03:49:15.936435 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="654bc6e4-18a2-4da6-a30a-9af36c8df0ec" containerName="nova-scheduler-scheduler" Jan 29 03:49:16 crc kubenswrapper[4707]: I0129 03:49:16.054841 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:16 crc kubenswrapper[4707]: I0129 03:49:16.066952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerStarted","Data":"38f26552e2e3243ab7a6088945d74242b62ce730bc39ed5d6e3aad0f06cf1191"} Jan 29 03:49:16 crc kubenswrapper[4707]: I0129 03:49:16.068619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"47acef2f-13c3-47cd-b61f-a65e20f570a4","Type":"ContainerStarted","Data":"e13e8665bd7764ab643d6a43b1d75cfcae8fb39225ee1c09cf61326a9c572b2d"} Jan 29 03:49:16 crc kubenswrapper[4707]: W0129 03:49:16.070664 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d07ad17_c85d_4890_9636_b02564c1c482.slice/crio-e80767a19ee00e9afb99ceff65a262873c3633974b90e7602c15d72b960e6553 WatchSource:0}: Error finding container e80767a19ee00e9afb99ceff65a262873c3633974b90e7602c15d72b960e6553: Status 404 returned error can't find the container with id e80767a19ee00e9afb99ceff65a262873c3633974b90e7602c15d72b960e6553 Jan 29 03:49:17 crc kubenswrapper[4707]: I0129 03:49:17.080323 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"47acef2f-13c3-47cd-b61f-a65e20f570a4","Type":"ContainerStarted","Data":"3acbee3bcb5bed9b6d0b2a45dcb5f81e03369e5c8e5fd5c15b2115401043e3aa"} Jan 29 03:49:17 crc kubenswrapper[4707]: I0129 03:49:17.080833 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:17 crc kubenswrapper[4707]: I0129 03:49:17.083633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerStarted","Data":"c7a9c7c28622182a68e0347b5c8ed61fd9bb1d875920288d0d6eb9a6d6138614"} Jan 29 03:49:17 crc kubenswrapper[4707]: I0129 03:49:17.083696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerStarted","Data":"5076de70b6044ddae46d7717f21d2469b924a0e3bb1eaa052aeaf068f5a156c0"} Jan 29 03:49:17 crc kubenswrapper[4707]: I0129 03:49:17.086370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d07ad17-c85d-4890-9636-b02564c1c482","Type":"ContainerStarted","Data":"ec94449a991d649c47094022ec2cbc29578fe96f14d79940db68fa092ec2d267"} Jan 29 03:49:17 crc kubenswrapper[4707]: I0129 03:49:17.086432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d07ad17-c85d-4890-9636-b02564c1c482","Type":"ContainerStarted","Data":"3bbd1c6ecf561f8ae2565973ac9dc34d83ec9e52f04b968f70bb36b68ea9b7e8"} Jan 29 03:49:17 crc kubenswrapper[4707]: I0129 03:49:17.086447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d07ad17-c85d-4890-9636-b02564c1c482","Type":"ContainerStarted","Data":"e80767a19ee00e9afb99ceff65a262873c3633974b90e7602c15d72b960e6553"} Jan 29 03:49:17 crc kubenswrapper[4707]: I0129 03:49:17.105863 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.10584267 podStartE2EDuration="2.10584267s" podCreationTimestamp="2026-01-29 03:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:17.099775267 +0000 UTC m=+1310.584004162" watchObservedRunningTime="2026-01-29 03:49:17.10584267 +0000 UTC m=+1310.590071575" Jan 29 03:49:17 crc kubenswrapper[4707]: I0129 03:49:17.150894 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.150869427 podStartE2EDuration="2.150869427s" podCreationTimestamp="2026-01-29 03:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:17.133059788 +0000 UTC m=+1310.617288693" watchObservedRunningTime="2026-01-29 03:49:17.150869427 +0000 UTC m=+1310.635098332" Jan 29 03:49:18 crc kubenswrapper[4707]: I0129 03:49:18.098762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerStarted","Data":"2a90035c9d58389248e6fbe14d378c8c435ab2fb3e7c7171988d37c8a2bbdc11"} Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.112507 4707 generic.go:334] "Generic (PLEG): container finished" podID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerID="46cc6c097bb1ec5c99a016d57ff85959148ea67be60e7fb165d9b2d1a56d1bef" exitCode=0 Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.112569 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3df36f3e-f1ff-4a7a-94f5-251e977949cc","Type":"ContainerDied","Data":"46cc6c097bb1ec5c99a016d57ff85959148ea67be60e7fb165d9b2d1a56d1bef"} Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.113012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"3df36f3e-f1ff-4a7a-94f5-251e977949cc","Type":"ContainerDied","Data":"3085370e438d89cc2efb6316eef504add112e056341b1a2d7e3db3aa29afc2f6"} Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.113033 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3085370e438d89cc2efb6316eef504add112e056341b1a2d7e3db3aa29afc2f6" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.260688 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.351497 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.395687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-config-data\") pod \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.395750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp7rv\" (UniqueName: \"kubernetes.io/projected/3df36f3e-f1ff-4a7a-94f5-251e977949cc-kube-api-access-dp7rv\") pod \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.395927 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-combined-ca-bundle\") pod \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.395995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3df36f3e-f1ff-4a7a-94f5-251e977949cc-logs\") pod \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\" (UID: \"3df36f3e-f1ff-4a7a-94f5-251e977949cc\") " Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.402167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df36f3e-f1ff-4a7a-94f5-251e977949cc-logs" (OuterVolumeSpecName: "logs") pod "3df36f3e-f1ff-4a7a-94f5-251e977949cc" (UID: "3df36f3e-f1ff-4a7a-94f5-251e977949cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.410889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df36f3e-f1ff-4a7a-94f5-251e977949cc-kube-api-access-dp7rv" (OuterVolumeSpecName: "kube-api-access-dp7rv") pod "3df36f3e-f1ff-4a7a-94f5-251e977949cc" (UID: "3df36f3e-f1ff-4a7a-94f5-251e977949cc"). InnerVolumeSpecName "kube-api-access-dp7rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.432659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3df36f3e-f1ff-4a7a-94f5-251e977949cc" (UID: "3df36f3e-f1ff-4a7a-94f5-251e977949cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.453033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-config-data" (OuterVolumeSpecName: "config-data") pod "3df36f3e-f1ff-4a7a-94f5-251e977949cc" (UID: "3df36f3e-f1ff-4a7a-94f5-251e977949cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.498986 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.499030 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp7rv\" (UniqueName: \"kubernetes.io/projected/3df36f3e-f1ff-4a7a-94f5-251e977949cc-kube-api-access-dp7rv\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.499046 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df36f3e-f1ff-4a7a-94f5-251e977949cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.499060 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df36f3e-f1ff-4a7a-94f5-251e977949cc-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.803417 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.907146 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-combined-ca-bundle\") pod \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.907237 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfdlz\" (UniqueName: \"kubernetes.io/projected/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-kube-api-access-lfdlz\") pod \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.907329 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-config-data\") pod \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\" (UID: \"654bc6e4-18a2-4da6-a30a-9af36c8df0ec\") " Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.914878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-kube-api-access-lfdlz" (OuterVolumeSpecName: "kube-api-access-lfdlz") pod "654bc6e4-18a2-4da6-a30a-9af36c8df0ec" (UID: "654bc6e4-18a2-4da6-a30a-9af36c8df0ec"). InnerVolumeSpecName "kube-api-access-lfdlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.976526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-config-data" (OuterVolumeSpecName: "config-data") pod "654bc6e4-18a2-4da6-a30a-9af36c8df0ec" (UID: "654bc6e4-18a2-4da6-a30a-9af36c8df0ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:19 crc kubenswrapper[4707]: I0129 03:49:19.986408 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "654bc6e4-18a2-4da6-a30a-9af36c8df0ec" (UID: "654bc6e4-18a2-4da6-a30a-9af36c8df0ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.009858 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.010325 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfdlz\" (UniqueName: \"kubernetes.io/projected/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-kube-api-access-lfdlz\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.010478 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654bc6e4-18a2-4da6-a30a-9af36c8df0ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.126137 4707 generic.go:334] "Generic (PLEG): container finished" podID="654bc6e4-18a2-4da6-a30a-9af36c8df0ec" containerID="31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c" exitCode=0 Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.126256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"654bc6e4-18a2-4da6-a30a-9af36c8df0ec","Type":"ContainerDied","Data":"31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c"} Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.126305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"654bc6e4-18a2-4da6-a30a-9af36c8df0ec","Type":"ContainerDied","Data":"c9ced0953b0dd37cabdcb8cce222d34a69e6525c03964d0877a4af69f29ecc4f"} Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.126356 4707 scope.go:117] "RemoveContainer" containerID="31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.127617 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.129289 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.131373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerStarted","Data":"f4c4f53ac759b12452272de5a2372b7fd8039b269e9accde2d96864233c99d72"} Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.131422 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.150250 4707 scope.go:117] "RemoveContainer" containerID="31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c" Jan 29 03:49:20 crc kubenswrapper[4707]: E0129 03:49:20.150875 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c\": container with ID starting with 31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c not found: ID does not exist" containerID="31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.150928 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c"} err="failed to get container status \"31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c\": rpc error: code = NotFound desc = could not find container \"31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c\": container with ID starting with 31c2113ad36919a373d667a44c1972683c6d37681cb5a0ddfe029b03e2cff91c not found: ID does not exist" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.166209 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.578648438 podStartE2EDuration="6.166186379s" podCreationTimestamp="2026-01-29 03:49:14 +0000 UTC" firstStartedPulling="2026-01-29 03:49:15.56303119 +0000 UTC m=+1309.047260095" lastFinishedPulling="2026-01-29 03:49:19.150569131 +0000 UTC m=+1312.634798036" observedRunningTime="2026-01-29 03:49:20.159610631 +0000 UTC m=+1313.643839536" watchObservedRunningTime="2026-01-29 03:49:20.166186379 +0000 UTC m=+1313.650415284" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.192328 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.209099 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.223724 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.236382 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.254177 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:49:20 crc kubenswrapper[4707]: E0129 03:49:20.254713 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" 
containerName="nova-api-api" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.254734 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerName="nova-api-api" Jan 29 03:49:20 crc kubenswrapper[4707]: E0129 03:49:20.254758 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerName="nova-api-log" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.254771 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerName="nova-api-log" Jan 29 03:49:20 crc kubenswrapper[4707]: E0129 03:49:20.254786 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654bc6e4-18a2-4da6-a30a-9af36c8df0ec" containerName="nova-scheduler-scheduler" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.254792 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="654bc6e4-18a2-4da6-a30a-9af36c8df0ec" containerName="nova-scheduler-scheduler" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.255019 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerName="nova-api-log" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.255045 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="654bc6e4-18a2-4da6-a30a-9af36c8df0ec" containerName="nova-scheduler-scheduler" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.255066 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" containerName="nova-api-api" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.255825 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.259501 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.265050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.279673 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.281797 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.284933 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.290177 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.420147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.420355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6zrq\" (UniqueName: \"kubernetes.io/projected/678597fa-695b-4d44-aa1c-12127cac6804-kube-api-access-j6zrq\") pod \"nova-scheduler-0\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.420561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6cnm\" (UniqueName: 
\"kubernetes.io/projected/1ed1f0e8-5f4f-428b-90ba-26a19000f166-kube-api-access-r6cnm\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.420970 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.421503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed1f0e8-5f4f-428b-90ba-26a19000f166-logs\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.421664 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-config-data\") pod \"nova-scheduler-0\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.421750 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-config-data\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.524061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " 
pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.524164 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6zrq\" (UniqueName: \"kubernetes.io/projected/678597fa-695b-4d44-aa1c-12127cac6804-kube-api-access-j6zrq\") pod \"nova-scheduler-0\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.524209 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6cnm\" (UniqueName: \"kubernetes.io/projected/1ed1f0e8-5f4f-428b-90ba-26a19000f166-kube-api-access-r6cnm\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.524588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.524740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed1f0e8-5f4f-428b-90ba-26a19000f166-logs\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.525423 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed1f0e8-5f4f-428b-90ba-26a19000f166-logs\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.525524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-config-data\") pod \"nova-scheduler-0\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.526120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-config-data\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.528334 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.528640 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.528756 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.529036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-config-data\") pod \"nova-scheduler-0\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.530933 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.533189 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-config-data\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.542533 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6cnm\" (UniqueName: \"kubernetes.io/projected/1ed1f0e8-5f4f-428b-90ba-26a19000f166-kube-api-access-r6cnm\") pod \"nova-api-0\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " pod="openstack/nova-api-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.548186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6zrq\" (UniqueName: \"kubernetes.io/projected/678597fa-695b-4d44-aa1c-12127cac6804-kube-api-access-j6zrq\") pod \"nova-scheduler-0\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.582144 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:49:20 crc kubenswrapper[4707]: I0129 03:49:20.603951 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:21 crc kubenswrapper[4707]: I0129 03:49:21.152017 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:21 crc kubenswrapper[4707]: I0129 03:49:21.224030 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:49:21 crc kubenswrapper[4707]: I0129 03:49:21.262796 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df36f3e-f1ff-4a7a-94f5-251e977949cc" path="/var/lib/kubelet/pods/3df36f3e-f1ff-4a7a-94f5-251e977949cc/volumes" Jan 29 03:49:21 crc kubenswrapper[4707]: I0129 03:49:21.265501 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="654bc6e4-18a2-4da6-a30a-9af36c8df0ec" path="/var/lib/kubelet/pods/654bc6e4-18a2-4da6-a30a-9af36c8df0ec/volumes" Jan 29 03:49:22 crc kubenswrapper[4707]: I0129 03:49:22.156646 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1ed1f0e8-5f4f-428b-90ba-26a19000f166","Type":"ContainerStarted","Data":"3d252085479894cf4e0f02122957d366fd45a72a9dde23d18b3eb7b0e89a67f0"} Jan 29 03:49:22 crc kubenswrapper[4707]: I0129 03:49:22.158237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1ed1f0e8-5f4f-428b-90ba-26a19000f166","Type":"ContainerStarted","Data":"a44b03bc9e7ce865da34a49641ef9b98751a9f33344af3824f6e66ca0aef7cde"} Jan 29 03:49:22 crc kubenswrapper[4707]: I0129 03:49:22.158342 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1ed1f0e8-5f4f-428b-90ba-26a19000f166","Type":"ContainerStarted","Data":"783093f2c11430e6823e9127fd59e49327fb41a34ac1857371e350d27a6826a6"} Jan 29 03:49:22 crc kubenswrapper[4707]: I0129 03:49:22.161066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"678597fa-695b-4d44-aa1c-12127cac6804","Type":"ContainerStarted","Data":"ed65b8105d3ec214fb32dc626a13214ac153b2773c9490eff8ebf854edbb75d5"} Jan 29 03:49:22 crc kubenswrapper[4707]: I0129 03:49:22.161118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"678597fa-695b-4d44-aa1c-12127cac6804","Type":"ContainerStarted","Data":"c24618c3b94339ac12dd4dfa36aafd2a10f2dfbc07615d2a0fd577dd8c12a976"} Jan 29 03:49:22 crc kubenswrapper[4707]: I0129 03:49:22.189670 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.189643672 podStartE2EDuration="2.189643672s" podCreationTimestamp="2026-01-29 03:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:22.178635087 +0000 UTC m=+1315.662864022" watchObservedRunningTime="2026-01-29 03:49:22.189643672 +0000 UTC m=+1315.673872587" Jan 29 03:49:22 crc kubenswrapper[4707]: I0129 03:49:22.212265 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.212244418 podStartE2EDuration="2.212244418s" podCreationTimestamp="2026-01-29 03:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:22.209992773 +0000 UTC m=+1315.694221708" watchObservedRunningTime="2026-01-29 03:49:22.212244418 +0000 UTC m=+1315.696473333" Jan 29 03:49:25 crc kubenswrapper[4707]: I0129 03:49:25.437022 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 29 03:49:25 crc kubenswrapper[4707]: I0129 03:49:25.529221 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 03:49:25 crc kubenswrapper[4707]: I0129 03:49:25.529286 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 03:49:25 crc kubenswrapper[4707]: I0129 03:49:25.584687 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 03:49:26 crc kubenswrapper[4707]: I0129 03:49:26.612854 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 03:49:26 crc kubenswrapper[4707]: I0129 03:49:26.612860 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 03:49:30 crc kubenswrapper[4707]: I0129 03:49:30.583277 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 29 03:49:30 crc kubenswrapper[4707]: I0129 03:49:30.604519 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 03:49:30 crc kubenswrapper[4707]: I0129 03:49:30.604613 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 03:49:30 crc kubenswrapper[4707]: I0129 03:49:30.625072 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 03:49:31 crc kubenswrapper[4707]: I0129 03:49:31.301511 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 03:49:31 crc kubenswrapper[4707]: I0129 03:49:31.687689 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 03:49:31 crc kubenswrapper[4707]: I0129 03:49:31.687689 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 03:49:33 crc kubenswrapper[4707]: I0129 03:49:33.463394 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:49:33 crc kubenswrapper[4707]: I0129 03:49:33.463484 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:49:35 crc kubenswrapper[4707]: I0129 03:49:35.534479 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 03:49:35 crc kubenswrapper[4707]: I0129 03:49:35.535152 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 03:49:35 crc kubenswrapper[4707]: I0129 03:49:35.540717 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 29 03:49:35 crc kubenswrapper[4707]: I0129 03:49:35.542419 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" 
Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.190584 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.344249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x754\" (UniqueName: \"kubernetes.io/projected/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-kube-api-access-7x754\") pod \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.344472 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-combined-ca-bundle\") pod \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.344518 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-config-data\") pod \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\" (UID: \"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f\") " Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.351961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-kube-api-access-7x754" (OuterVolumeSpecName: "kube-api-access-7x754") pod "bc59e1a6-5382-4b11-86e1-1d5bb3864a6f" (UID: "bc59e1a6-5382-4b11-86e1-1d5bb3864a6f"). InnerVolumeSpecName "kube-api-access-7x754". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.359348 4707 generic.go:334] "Generic (PLEG): container finished" podID="bc59e1a6-5382-4b11-86e1-1d5bb3864a6f" containerID="b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc" exitCode=137 Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.359575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f","Type":"ContainerDied","Data":"b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc"} Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.359640 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bc59e1a6-5382-4b11-86e1-1d5bb3864a6f","Type":"ContainerDied","Data":"fc46cb4a430b2a5abd089ff397272a996b1f40bfd6c4baabe044e737391658a5"} Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.359668 4707 scope.go:117] "RemoveContainer" containerID="b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.359969 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.380229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc59e1a6-5382-4b11-86e1-1d5bb3864a6f" (UID: "bc59e1a6-5382-4b11-86e1-1d5bb3864a6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.389549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-config-data" (OuterVolumeSpecName: "config-data") pod "bc59e1a6-5382-4b11-86e1-1d5bb3864a6f" (UID: "bc59e1a6-5382-4b11-86e1-1d5bb3864a6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.448451 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.448545 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.448603 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x754\" (UniqueName: \"kubernetes.io/projected/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f-kube-api-access-7x754\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.470347 4707 scope.go:117] "RemoveContainer" containerID="b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc" Jan 29 03:49:37 crc kubenswrapper[4707]: E0129 03:49:37.471148 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc\": container with ID starting with b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc not found: ID does not exist" containerID="b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 
03:49:37.471216 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc"} err="failed to get container status \"b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc\": rpc error: code = NotFound desc = could not find container \"b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc\": container with ID starting with b993739093bd9fedade7ff84aee7ccab6cf563f2eee591b1956dfd449fa1efdc not found: ID does not exist" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.744293 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.764047 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.782470 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 03:49:37 crc kubenswrapper[4707]: E0129 03:49:37.783364 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc59e1a6-5382-4b11-86e1-1d5bb3864a6f" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.783444 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc59e1a6-5382-4b11-86e1-1d5bb3864a6f" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.783760 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc59e1a6-5382-4b11-86e1-1d5bb3864a6f" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.784754 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.789685 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.789727 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.790973 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.816203 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.965437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.965549 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vft8r\" (UniqueName: \"kubernetes.io/projected/99c8a2aa-31c2-4927-af04-8f5e8c50198e-kube-api-access-vft8r\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.965881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:37 crc 
kubenswrapper[4707]: I0129 03:49:37.966093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:37 crc kubenswrapper[4707]: I0129 03:49:37.966140 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.068763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.068832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.068898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 
03:49:38.068971 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vft8r\" (UniqueName: \"kubernetes.io/projected/99c8a2aa-31c2-4927-af04-8f5e8c50198e-kube-api-access-vft8r\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.069035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.075817 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.077774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.078391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.079673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/99c8a2aa-31c2-4927-af04-8f5e8c50198e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.095091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vft8r\" (UniqueName: \"kubernetes.io/projected/99c8a2aa-31c2-4927-af04-8f5e8c50198e-kube-api-access-vft8r\") pod \"nova-cell1-novncproxy-0\" (UID: \"99c8a2aa-31c2-4927-af04-8f5e8c50198e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.115424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:38 crc kubenswrapper[4707]: I0129 03:49:38.589201 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 03:49:39 crc kubenswrapper[4707]: I0129 03:49:39.259872 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc59e1a6-5382-4b11-86e1-1d5bb3864a6f" path="/var/lib/kubelet/pods/bc59e1a6-5382-4b11-86e1-1d5bb3864a6f/volumes" Jan 29 03:49:39 crc kubenswrapper[4707]: I0129 03:49:39.395050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"99c8a2aa-31c2-4927-af04-8f5e8c50198e","Type":"ContainerStarted","Data":"57d342322692352266d62e605b28d10cda1fcb01f648e7194d6f7a36cda8b9c3"} Jan 29 03:49:39 crc kubenswrapper[4707]: I0129 03:49:39.395123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"99c8a2aa-31c2-4927-af04-8f5e8c50198e","Type":"ContainerStarted","Data":"771ffcb1af4bc90607424889796941daf8acffa60b1a1eef7f32f44b7c9419ee"} Jan 29 03:49:39 crc kubenswrapper[4707]: I0129 03:49:39.432883 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.432856632 podStartE2EDuration="2.432856632s" podCreationTimestamp="2026-01-29 03:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:39.42300178 +0000 UTC m=+1332.907230685" watchObservedRunningTime="2026-01-29 03:49:39.432856632 +0000 UTC m=+1332.917085547" Jan 29 03:49:40 crc kubenswrapper[4707]: I0129 03:49:40.609570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 29 03:49:40 crc kubenswrapper[4707]: I0129 03:49:40.610677 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 29 03:49:40 crc kubenswrapper[4707]: I0129 03:49:40.611091 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 03:49:40 crc kubenswrapper[4707]: I0129 03:49:40.611125 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 03:49:40 crc kubenswrapper[4707]: I0129 03:49:40.614971 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 03:49:40 crc kubenswrapper[4707]: I0129 03:49:40.620719 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 03:49:40 crc kubenswrapper[4707]: I0129 03:49:40.853995 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q"] Jan 29 03:49:40 crc kubenswrapper[4707]: I0129 03:49:40.864513 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:40 crc kubenswrapper[4707]: I0129 03:49:40.884708 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q"] Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.056766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57g28\" (UniqueName: \"kubernetes.io/projected/c3cea440-5000-4c97-96ab-6436f2a69e02-kube-api-access-57g28\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.056896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.057708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.057776 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.057891 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-config\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.057966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.160687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57g28\" (UniqueName: \"kubernetes.io/projected/c3cea440-5000-4c97-96ab-6436f2a69e02-kube-api-access-57g28\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.160853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.160902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.160994 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.161240 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-config\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.161284 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.162007 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.162366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-config\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.162366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.162567 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.163258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.195010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57g28\" (UniqueName: \"kubernetes.io/projected/c3cea440-5000-4c97-96ab-6436f2a69e02-kube-api-access-57g28\") pod \"dnsmasq-dns-6b7bbf7cf9-hzv6q\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.195770 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:41 crc kubenswrapper[4707]: W0129 03:49:41.715259 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3cea440_5000_4c97_96ab_6436f2a69e02.slice/crio-b73f6c28a966b865269e061eaed71a049eb8806b4d38467f011e2cfa1eac16ce WatchSource:0}: Error finding container b73f6c28a966b865269e061eaed71a049eb8806b4d38467f011e2cfa1eac16ce: Status 404 returned error can't find the container with id b73f6c28a966b865269e061eaed71a049eb8806b4d38467f011e2cfa1eac16ce Jan 29 03:49:41 crc kubenswrapper[4707]: I0129 03:49:41.718426 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q"] Jan 29 03:49:42 crc kubenswrapper[4707]: I0129 03:49:42.442074 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3cea440-5000-4c97-96ab-6436f2a69e02" containerID="c2799b8df9451fead2668b0350f816d18b6885f5f3533e4cbc943f7943c22b99" exitCode=0 Jan 29 03:49:42 crc kubenswrapper[4707]: I0129 03:49:42.442171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" event={"ID":"c3cea440-5000-4c97-96ab-6436f2a69e02","Type":"ContainerDied","Data":"c2799b8df9451fead2668b0350f816d18b6885f5f3533e4cbc943f7943c22b99"} Jan 29 03:49:42 crc kubenswrapper[4707]: I0129 03:49:42.442737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" event={"ID":"c3cea440-5000-4c97-96ab-6436f2a69e02","Type":"ContainerStarted","Data":"b73f6c28a966b865269e061eaed71a049eb8806b4d38467f011e2cfa1eac16ce"} Jan 29 03:49:42 crc kubenswrapper[4707]: I0129 03:49:42.977300 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:49:42 crc kubenswrapper[4707]: I0129 03:49:42.977678 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="ceilometer-central-agent" containerID="cri-o://5076de70b6044ddae46d7717f21d2469b924a0e3bb1eaa052aeaf068f5a156c0" gracePeriod=30 Jan 29 03:49:42 crc kubenswrapper[4707]: I0129 03:49:42.977855 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="proxy-httpd" containerID="cri-o://f4c4f53ac759b12452272de5a2372b7fd8039b269e9accde2d96864233c99d72" gracePeriod=30 Jan 29 03:49:42 crc kubenswrapper[4707]: I0129 03:49:42.977900 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="sg-core" containerID="cri-o://2a90035c9d58389248e6fbe14d378c8c435ab2fb3e7c7171988d37c8a2bbdc11" gracePeriod=30 Jan 29 03:49:42 crc kubenswrapper[4707]: I0129 03:49:42.977940 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="ceilometer-notification-agent" containerID="cri-o://c7a9c7c28622182a68e0347b5c8ed61fd9bb1d875920288d0d6eb9a6d6138614" gracePeriod=30 Jan 29 03:49:42 crc kubenswrapper[4707]: I0129 03:49:42.987833 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.213:3000/\": read tcp 10.217.0.2:40330->10.217.0.213:3000: read: connection reset by peer" Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.116640 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.462027 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9871425-141a-4770-9a04-117e63870be7" 
containerID="f4c4f53ac759b12452272de5a2372b7fd8039b269e9accde2d96864233c99d72" exitCode=0 Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.462470 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9871425-141a-4770-9a04-117e63870be7" containerID="2a90035c9d58389248e6fbe14d378c8c435ab2fb3e7c7171988d37c8a2bbdc11" exitCode=2 Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.462521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerDied","Data":"f4c4f53ac759b12452272de5a2372b7fd8039b269e9accde2d96864233c99d72"} Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.462567 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerDied","Data":"2a90035c9d58389248e6fbe14d378c8c435ab2fb3e7c7171988d37c8a2bbdc11"} Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.465478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" event={"ID":"c3cea440-5000-4c97-96ab-6436f2a69e02","Type":"ContainerStarted","Data":"ac6a46f30ac327783b2b2ce5e8c1294e97aee8f18f614a5b0fcc6c2e2b0f9762"} Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.465602 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.465923 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-log" containerID="cri-o://a44b03bc9e7ce865da34a49641ef9b98751a9f33344af3824f6e66ca0aef7cde" gracePeriod=30 Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.466731 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.466848 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-api" containerID="cri-o://3d252085479894cf4e0f02122957d366fd45a72a9dde23d18b3eb7b0e89a67f0" gracePeriod=30 Jan 29 03:49:43 crc kubenswrapper[4707]: I0129 03:49:43.498374 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" podStartSLOduration=3.498352519 podStartE2EDuration="3.498352519s" podCreationTimestamp="2026-01-29 03:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:43.490626248 +0000 UTC m=+1336.974855153" watchObservedRunningTime="2026-01-29 03:49:43.498352519 +0000 UTC m=+1336.982581424" Jan 29 03:49:44 crc kubenswrapper[4707]: I0129 03:49:44.494418 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9871425-141a-4770-9a04-117e63870be7" containerID="5076de70b6044ddae46d7717f21d2469b924a0e3bb1eaa052aeaf068f5a156c0" exitCode=0 Jan 29 03:49:44 crc kubenswrapper[4707]: I0129 03:49:44.494473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerDied","Data":"5076de70b6044ddae46d7717f21d2469b924a0e3bb1eaa052aeaf068f5a156c0"} Jan 29 03:49:44 crc kubenswrapper[4707]: I0129 03:49:44.497346 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerID="a44b03bc9e7ce865da34a49641ef9b98751a9f33344af3824f6e66ca0aef7cde" exitCode=143 Jan 29 03:49:44 crc kubenswrapper[4707]: I0129 03:49:44.497393 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1ed1f0e8-5f4f-428b-90ba-26a19000f166","Type":"ContainerDied","Data":"a44b03bc9e7ce865da34a49641ef9b98751a9f33344af3824f6e66ca0aef7cde"} Jan 29 03:49:44 crc kubenswrapper[4707]: I0129 
03:49:44.968221 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.213:3000/\": dial tcp 10.217.0.213:3000: connect: connection refused" Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.529189 4707 generic.go:334] "Generic (PLEG): container finished" podID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerID="3d252085479894cf4e0f02122957d366fd45a72a9dde23d18b3eb7b0e89a67f0" exitCode=0 Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.529346 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1ed1f0e8-5f4f-428b-90ba-26a19000f166","Type":"ContainerDied","Data":"3d252085479894cf4e0f02122957d366fd45a72a9dde23d18b3eb7b0e89a67f0"} Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.701758 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.836105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-config-data\") pod \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.836979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6cnm\" (UniqueName: \"kubernetes.io/projected/1ed1f0e8-5f4f-428b-90ba-26a19000f166-kube-api-access-r6cnm\") pod \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.837899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed1f0e8-5f4f-428b-90ba-26a19000f166-logs\") pod 
\"1ed1f0e8-5f4f-428b-90ba-26a19000f166\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.838000 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-combined-ca-bundle\") pod \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\" (UID: \"1ed1f0e8-5f4f-428b-90ba-26a19000f166\") " Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.838630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed1f0e8-5f4f-428b-90ba-26a19000f166-logs" (OuterVolumeSpecName: "logs") pod "1ed1f0e8-5f4f-428b-90ba-26a19000f166" (UID: "1ed1f0e8-5f4f-428b-90ba-26a19000f166"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.839232 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed1f0e8-5f4f-428b-90ba-26a19000f166-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.852313 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed1f0e8-5f4f-428b-90ba-26a19000f166-kube-api-access-r6cnm" (OuterVolumeSpecName: "kube-api-access-r6cnm") pod "1ed1f0e8-5f4f-428b-90ba-26a19000f166" (UID: "1ed1f0e8-5f4f-428b-90ba-26a19000f166"). InnerVolumeSpecName "kube-api-access-r6cnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.883363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-config-data" (OuterVolumeSpecName: "config-data") pod "1ed1f0e8-5f4f-428b-90ba-26a19000f166" (UID: "1ed1f0e8-5f4f-428b-90ba-26a19000f166"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.890478 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ed1f0e8-5f4f-428b-90ba-26a19000f166" (UID: "1ed1f0e8-5f4f-428b-90ba-26a19000f166"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.942893 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.942933 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6cnm\" (UniqueName: \"kubernetes.io/projected/1ed1f0e8-5f4f-428b-90ba-26a19000f166-kube-api-access-r6cnm\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:47 crc kubenswrapper[4707]: I0129 03:49:47.942945 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1f0e8-5f4f-428b-90ba-26a19000f166-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.116018 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.169425 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.558941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1ed1f0e8-5f4f-428b-90ba-26a19000f166","Type":"ContainerDied","Data":"783093f2c11430e6823e9127fd59e49327fb41a34ac1857371e350d27a6826a6"} Jan 29 03:49:48 crc 
kubenswrapper[4707]: I0129 03:49:48.559383 4707 scope.go:117] "RemoveContainer" containerID="3d252085479894cf4e0f02122957d366fd45a72a9dde23d18b3eb7b0e89a67f0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.559666 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.573933 4707 generic.go:334] "Generic (PLEG): container finished" podID="f9871425-141a-4770-9a04-117e63870be7" containerID="c7a9c7c28622182a68e0347b5c8ed61fd9bb1d875920288d0d6eb9a6d6138614" exitCode=0 Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.576332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerDied","Data":"c7a9c7c28622182a68e0347b5c8ed61fd9bb1d875920288d0d6eb9a6d6138614"} Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.604965 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.610792 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.622115 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.635229 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:48 crc kubenswrapper[4707]: E0129 03:49:48.635759 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-api" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.635774 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-api" Jan 29 03:49:48 crc kubenswrapper[4707]: E0129 03:49:48.635804 4707 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-log" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.635812 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-log" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.636043 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-log" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.636067 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" containerName="nova-api-api" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.637866 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.641782 4707 scope.go:117] "RemoveContainer" containerID="a44b03bc9e7ce865da34a49641ef9b98751a9f33344af3824f6e66ca0aef7cde" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.642943 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.655650 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.656307 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.672566 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.776661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fpbw\" (UniqueName: \"kubernetes.io/projected/60055891-f5cf-4003-87b1-116314371ba3-kube-api-access-8fpbw\") pod \"nova-api-0\" (UID: 
\"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.776754 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60055891-f5cf-4003-87b1-116314371ba3-logs\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.776799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.776861 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.777008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-public-tls-certs\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.777201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-config-data\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.814117 4707 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-zntq4"] Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.816014 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.819479 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.819729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.836640 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zntq4"] Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.879012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-config-data\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.881030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fpbw\" (UniqueName: \"kubernetes.io/projected/60055891-f5cf-4003-87b1-116314371ba3-kube-api-access-8fpbw\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.881192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60055891-f5cf-4003-87b1-116314371ba3-logs\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.881298 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.881458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.881500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-public-tls-certs\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.887421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60055891-f5cf-4003-87b1-116314371ba3-logs\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.891949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-public-tls-certs\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.904326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-config-data\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.907413 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8fpbw\" (UniqueName: \"kubernetes.io/projected/60055891-f5cf-4003-87b1-116314371ba3-kube-api-access-8fpbw\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.909940 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.916256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " pod="openstack/nova-api-0" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.983318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-scripts\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.983377 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-config-data\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.983663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lpdj\" (UniqueName: 
\"kubernetes.io/projected/90f02b4a-bf97-46a9-94a1-b60db6b01a33-kube-api-access-7lpdj\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.983725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:48 crc kubenswrapper[4707]: I0129 03:49:48.998806 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.087232 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-scripts\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.087286 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-config-data\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.087336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lpdj\" (UniqueName: \"kubernetes.io/projected/90f02b4a-bf97-46a9-94a1-b60db6b01a33-kube-api-access-7lpdj\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:49 crc 
kubenswrapper[4707]: I0129 03:49:49.087375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.092628 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-scripts\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.092868 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-config-data\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.093319 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.110967 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lpdj\" (UniqueName: \"kubernetes.io/projected/90f02b4a-bf97-46a9-94a1-b60db6b01a33-kube-api-access-7lpdj\") pod \"nova-cell1-cell-mapping-zntq4\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.145011 4707 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.172424 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.256431 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed1f0e8-5f4f-428b-90ba-26a19000f166" path="/var/lib/kubelet/pods/1ed1f0e8-5f4f-428b-90ba-26a19000f166/volumes" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.291653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-ceilometer-tls-certs\") pod \"f9871425-141a-4770-9a04-117e63870be7\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.291958 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-run-httpd\") pod \"f9871425-141a-4770-9a04-117e63870be7\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.292054 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-log-httpd\") pod \"f9871425-141a-4770-9a04-117e63870be7\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.292103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-config-data\") pod \"f9871425-141a-4770-9a04-117e63870be7\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.292144 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx28c\" (UniqueName: \"kubernetes.io/projected/f9871425-141a-4770-9a04-117e63870be7-kube-api-access-rx28c\") pod \"f9871425-141a-4770-9a04-117e63870be7\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.293052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f9871425-141a-4770-9a04-117e63870be7" (UID: "f9871425-141a-4770-9a04-117e63870be7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.293442 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f9871425-141a-4770-9a04-117e63870be7" (UID: "f9871425-141a-4770-9a04-117e63870be7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.293514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-sg-core-conf-yaml\") pod \"f9871425-141a-4770-9a04-117e63870be7\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.294149 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-combined-ca-bundle\") pod \"f9871425-141a-4770-9a04-117e63870be7\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.294206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-scripts\") pod \"f9871425-141a-4770-9a04-117e63870be7\" (UID: \"f9871425-141a-4770-9a04-117e63870be7\") " Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.295069 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.295088 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9871425-141a-4770-9a04-117e63870be7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.301089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-scripts" (OuterVolumeSpecName: "scripts") pod "f9871425-141a-4770-9a04-117e63870be7" (UID: "f9871425-141a-4770-9a04-117e63870be7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.301986 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9871425-141a-4770-9a04-117e63870be7-kube-api-access-rx28c" (OuterVolumeSpecName: "kube-api-access-rx28c") pod "f9871425-141a-4770-9a04-117e63870be7" (UID: "f9871425-141a-4770-9a04-117e63870be7"). InnerVolumeSpecName "kube-api-access-rx28c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.371782 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f9871425-141a-4770-9a04-117e63870be7" (UID: "f9871425-141a-4770-9a04-117e63870be7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.394223 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f9871425-141a-4770-9a04-117e63870be7" (UID: "f9871425-141a-4770-9a04-117e63870be7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.396709 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.396736 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx28c\" (UniqueName: \"kubernetes.io/projected/f9871425-141a-4770-9a04-117e63870be7-kube-api-access-rx28c\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.396748 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.396759 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.436228 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9871425-141a-4770-9a04-117e63870be7" (UID: "f9871425-141a-4770-9a04-117e63870be7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.438119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-config-data" (OuterVolumeSpecName: "config-data") pod "f9871425-141a-4770-9a04-117e63870be7" (UID: "f9871425-141a-4770-9a04-117e63870be7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:49 crc kubenswrapper[4707]: W0129 03:49:49.485901 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60055891_f5cf_4003_87b1_116314371ba3.slice/crio-96bcbf0c3e6e4fc01661627c22655da0807bdc3563836cd46c3112ce051d4e3c WatchSource:0}: Error finding container 96bcbf0c3e6e4fc01661627c22655da0807bdc3563836cd46c3112ce051d4e3c: Status 404 returned error can't find the container with id 96bcbf0c3e6e4fc01661627c22655da0807bdc3563836cd46c3112ce051d4e3c Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.487198 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.497912 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.497936 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9871425-141a-4770-9a04-117e63870be7-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.589476 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9871425-141a-4770-9a04-117e63870be7","Type":"ContainerDied","Data":"38f26552e2e3243ab7a6088945d74242b62ce730bc39ed5d6e3aad0f06cf1191"} Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.589587 4707 scope.go:117] "RemoveContainer" containerID="f4c4f53ac759b12452272de5a2372b7fd8039b269e9accde2d96864233c99d72" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.589526 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.591093 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60055891-f5cf-4003-87b1-116314371ba3","Type":"ContainerStarted","Data":"96bcbf0c3e6e4fc01661627c22655da0807bdc3563836cd46c3112ce051d4e3c"} Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.629056 4707 scope.go:117] "RemoveContainer" containerID="2a90035c9d58389248e6fbe14d378c8c435ab2fb3e7c7171988d37c8a2bbdc11" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.637498 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.656982 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.672136 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:49:49 crc kubenswrapper[4707]: E0129 03:49:49.672644 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="proxy-httpd" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.672663 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="proxy-httpd" Jan 29 03:49:49 crc kubenswrapper[4707]: E0129 03:49:49.672690 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="ceilometer-central-agent" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.672699 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="ceilometer-central-agent" Jan 29 03:49:49 crc kubenswrapper[4707]: E0129 03:49:49.672714 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="sg-core" Jan 29 03:49:49 crc 
kubenswrapper[4707]: I0129 03:49:49.672720 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="sg-core" Jan 29 03:49:49 crc kubenswrapper[4707]: E0129 03:49:49.672742 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="ceilometer-notification-agent" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.672748 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="ceilometer-notification-agent" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.672916 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="sg-core" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.672931 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="proxy-httpd" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.672943 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="ceilometer-notification-agent" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.672955 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9871425-141a-4770-9a04-117e63870be7" containerName="ceilometer-central-agent" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.674759 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.689641 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.689650 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.690034 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.693682 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.702577 4707 scope.go:117] "RemoveContainer" containerID="c7a9c7c28622182a68e0347b5c8ed61fd9bb1d875920288d0d6eb9a6d6138614" Jan 29 03:49:49 crc kubenswrapper[4707]: W0129 03:49:49.730336 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90f02b4a_bf97_46a9_94a1_b60db6b01a33.slice/crio-92a08920cf03945b6a4a3bf2f7aba93c67e92df073b40e97995c1a0fac19992f WatchSource:0}: Error finding container 92a08920cf03945b6a4a3bf2f7aba93c67e92df073b40e97995c1a0fac19992f: Status 404 returned error can't find the container with id 92a08920cf03945b6a4a3bf2f7aba93c67e92df073b40e97995c1a0fac19992f Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.739446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zntq4"] Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.742084 4707 scope.go:117] "RemoveContainer" containerID="5076de70b6044ddae46d7717f21d2469b924a0e3bb1eaa052aeaf068f5a156c0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.807404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-run-httpd\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.807473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-scripts\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.807514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-config-data\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.807652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.807676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492hd\" (UniqueName: \"kubernetes.io/projected/7470b455-8eb5-43ed-85cd-ad132974a76e-kube-api-access-492hd\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.807746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.807984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-log-httpd\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.808120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.910080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-scripts\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.910144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-config-data\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.910256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.910283 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-492hd\" (UniqueName: \"kubernetes.io/projected/7470b455-8eb5-43ed-85cd-ad132974a76e-kube-api-access-492hd\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.910324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.910347 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-log-httpd\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.910368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.910407 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-run-httpd\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.910876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-run-httpd\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " 
pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.911219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-log-httpd\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.917105 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.918820 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.919615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-scripts\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.929432 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492hd\" (UniqueName: \"kubernetes.io/projected/7470b455-8eb5-43ed-85cd-ad132974a76e-kube-api-access-492hd\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.932710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:49 crc kubenswrapper[4707]: I0129 03:49:49.933299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-config-data\") pod \"ceilometer-0\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " pod="openstack/ceilometer-0" Jan 29 03:49:50 crc kubenswrapper[4707]: I0129 03:49:50.004068 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 03:49:50 crc kubenswrapper[4707]: I0129 03:49:50.610625 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zntq4" event={"ID":"90f02b4a-bf97-46a9-94a1-b60db6b01a33","Type":"ContainerStarted","Data":"bcc5990f8297b84dc1ce717f93a19bc0397cd8a458ae463db68bf4c18fea9903"} Jan 29 03:49:50 crc kubenswrapper[4707]: I0129 03:49:50.611629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zntq4" event={"ID":"90f02b4a-bf97-46a9-94a1-b60db6b01a33","Type":"ContainerStarted","Data":"92a08920cf03945b6a4a3bf2f7aba93c67e92df073b40e97995c1a0fac19992f"} Jan 29 03:49:50 crc kubenswrapper[4707]: I0129 03:49:50.612057 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60055891-f5cf-4003-87b1-116314371ba3","Type":"ContainerStarted","Data":"586b7cbfab597f6d9d8cd87d4a13627b6f1d3b49cf67560a017114f26a6fe3e7"} Jan 29 03:49:50 crc kubenswrapper[4707]: I0129 03:49:50.612081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60055891-f5cf-4003-87b1-116314371ba3","Type":"ContainerStarted","Data":"20ae22004480da49b61fb14dac00e4fe5a6c6c0d19c2d0311cf40f94ca7e9d36"} Jan 29 03:49:50 crc kubenswrapper[4707]: I0129 03:49:50.637821 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-zntq4" podStartSLOduration=2.637797173 podStartE2EDuration="2.637797173s" podCreationTimestamp="2026-01-29 03:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:50.629172867 +0000 UTC m=+1344.113401772" watchObservedRunningTime="2026-01-29 03:49:50.637797173 +0000 UTC m=+1344.122026078" Jan 29 03:49:50 crc kubenswrapper[4707]: I0129 03:49:50.655788 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.655747846 podStartE2EDuration="2.655747846s" podCreationTimestamp="2026-01-29 03:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:49:50.651567656 +0000 UTC m=+1344.135796591" watchObservedRunningTime="2026-01-29 03:49:50.655747846 +0000 UTC m=+1344.139976791" Jan 29 03:49:51 crc kubenswrapper[4707]: I0129 03:49:51.200968 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:49:51 crc kubenswrapper[4707]: I0129 03:49:51.265424 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9871425-141a-4770-9a04-117e63870be7" path="/var/lib/kubelet/pods/f9871425-141a-4770-9a04-117e63870be7/volumes" Jan 29 03:49:51 crc kubenswrapper[4707]: W0129 03:49:51.291708 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7470b455_8eb5_43ed_85cd_ad132974a76e.slice/crio-5a7a7ea86c9065ec8deda5836beaffecd06d8afc4a2c61ebbaf6f7288ccacf10 WatchSource:0}: Error finding container 5a7a7ea86c9065ec8deda5836beaffecd06d8afc4a2c61ebbaf6f7288ccacf10: Status 404 returned error can't find the container with id 
5a7a7ea86c9065ec8deda5836beaffecd06d8afc4a2c61ebbaf6f7288ccacf10 Jan 29 03:49:51 crc kubenswrapper[4707]: I0129 03:49:51.304835 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 03:49:51 crc kubenswrapper[4707]: I0129 03:49:51.327320 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-kkkwk"] Jan 29 03:49:51 crc kubenswrapper[4707]: I0129 03:49:51.327668 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" podUID="74e85360-0f32-4475-a5e8-9af4d4085841" containerName="dnsmasq-dns" containerID="cri-o://93a155f7f55325167c7cd607feca85760d572069f39cc7e0489c899961c25798" gracePeriod=10 Jan 29 03:49:51 crc kubenswrapper[4707]: I0129 03:49:51.630008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerStarted","Data":"5a7a7ea86c9065ec8deda5836beaffecd06d8afc4a2c61ebbaf6f7288ccacf10"} Jan 29 03:49:51 crc kubenswrapper[4707]: I0129 03:49:51.634032 4707 generic.go:334] "Generic (PLEG): container finished" podID="74e85360-0f32-4475-a5e8-9af4d4085841" containerID="93a155f7f55325167c7cd607feca85760d572069f39cc7e0489c899961c25798" exitCode=0 Jan 29 03:49:51 crc kubenswrapper[4707]: I0129 03:49:51.634062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" event={"ID":"74e85360-0f32-4475-a5e8-9af4d4085841","Type":"ContainerDied","Data":"93a155f7f55325167c7cd607feca85760d572069f39cc7e0489c899961c25798"} Jan 29 03:49:51 crc kubenswrapper[4707]: I0129 03:49:51.904485 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.005212 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-config\") pod \"74e85360-0f32-4475-a5e8-9af4d4085841\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.005326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-swift-storage-0\") pod \"74e85360-0f32-4475-a5e8-9af4d4085841\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.005394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgspp\" (UniqueName: \"kubernetes.io/projected/74e85360-0f32-4475-a5e8-9af4d4085841-kube-api-access-qgspp\") pod \"74e85360-0f32-4475-a5e8-9af4d4085841\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.005558 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-nb\") pod \"74e85360-0f32-4475-a5e8-9af4d4085841\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.005673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-sb\") pod \"74e85360-0f32-4475-a5e8-9af4d4085841\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.005702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-svc\") pod \"74e85360-0f32-4475-a5e8-9af4d4085841\" (UID: \"74e85360-0f32-4475-a5e8-9af4d4085841\") " Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.014765 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e85360-0f32-4475-a5e8-9af4d4085841-kube-api-access-qgspp" (OuterVolumeSpecName: "kube-api-access-qgspp") pod "74e85360-0f32-4475-a5e8-9af4d4085841" (UID: "74e85360-0f32-4475-a5e8-9af4d4085841"). InnerVolumeSpecName "kube-api-access-qgspp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.051749 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74e85360-0f32-4475-a5e8-9af4d4085841" (UID: "74e85360-0f32-4475-a5e8-9af4d4085841"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.063720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-config" (OuterVolumeSpecName: "config") pod "74e85360-0f32-4475-a5e8-9af4d4085841" (UID: "74e85360-0f32-4475-a5e8-9af4d4085841"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.067153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74e85360-0f32-4475-a5e8-9af4d4085841" (UID: "74e85360-0f32-4475-a5e8-9af4d4085841"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.072597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74e85360-0f32-4475-a5e8-9af4d4085841" (UID: "74e85360-0f32-4475-a5e8-9af4d4085841"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.076289 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74e85360-0f32-4475-a5e8-9af4d4085841" (UID: "74e85360-0f32-4475-a5e8-9af4d4085841"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.108838 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.108875 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.108886 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.108895 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.108906 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgspp\" (UniqueName: \"kubernetes.io/projected/74e85360-0f32-4475-a5e8-9af4d4085841-kube-api-access-qgspp\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.108917 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74e85360-0f32-4475-a5e8-9af4d4085841-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.675609 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" event={"ID":"74e85360-0f32-4475-a5e8-9af4d4085841","Type":"ContainerDied","Data":"1adf593588d7b0e2bc26842217e19bfa2bcdf92a6341071f0860f3332fa77b07"} Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.676285 4707 scope.go:117] "RemoveContainer" containerID="93a155f7f55325167c7cd607feca85760d572069f39cc7e0489c899961c25798" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.676581 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-kkkwk" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.687859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerStarted","Data":"4d0e2d9b1eec9cc61f9f0651e523a9e4d75212d47403824ecd3e4321067f63bc"} Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.753048 4707 scope.go:117] "RemoveContainer" containerID="077108a8d9306f14acd08aeef68427d2b33186b6664b47ae784cdb80e983578b" Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.778335 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-kkkwk"] Jan 29 03:49:52 crc kubenswrapper[4707]: I0129 03:49:52.788703 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-kkkwk"] Jan 29 03:49:53 crc kubenswrapper[4707]: I0129 03:49:53.262527 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e85360-0f32-4475-a5e8-9af4d4085841" path="/var/lib/kubelet/pods/74e85360-0f32-4475-a5e8-9af4d4085841/volumes" Jan 29 03:49:53 crc kubenswrapper[4707]: I0129 03:49:53.703434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerStarted","Data":"4fcf6d6105ffe2ba61f0f3d63506f3c694c6c6b4f7d0dc7f810006808e80f883"} Jan 29 03:49:53 crc kubenswrapper[4707]: I0129 03:49:53.704184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerStarted","Data":"e6f50d4dacd63a73e289ee94fe2585ec411ded08580d9d073e076fae45c8692b"} Jan 29 03:49:55 crc kubenswrapper[4707]: I0129 03:49:55.724400 4707 generic.go:334] "Generic (PLEG): container finished" podID="90f02b4a-bf97-46a9-94a1-b60db6b01a33" containerID="bcc5990f8297b84dc1ce717f93a19bc0397cd8a458ae463db68bf4c18fea9903" exitCode=0 Jan 29 03:49:55 crc 
kubenswrapper[4707]: I0129 03:49:55.724510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zntq4" event={"ID":"90f02b4a-bf97-46a9-94a1-b60db6b01a33","Type":"ContainerDied","Data":"bcc5990f8297b84dc1ce717f93a19bc0397cd8a458ae463db68bf4c18fea9903"} Jan 29 03:49:56 crc kubenswrapper[4707]: I0129 03:49:56.752450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerStarted","Data":"3365cdbdc6e7a511395c02dd233d4de53a208ff117c4d707c42d3390af860ec2"} Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.092957 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.112731 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.339498843 podStartE2EDuration="8.11270803s" podCreationTimestamp="2026-01-29 03:49:49 +0000 UTC" firstStartedPulling="2026-01-29 03:49:51.296932456 +0000 UTC m=+1344.781161371" lastFinishedPulling="2026-01-29 03:49:56.070141643 +0000 UTC m=+1349.554370558" observedRunningTime="2026-01-29 03:49:56.795482156 +0000 UTC m=+1350.279711071" watchObservedRunningTime="2026-01-29 03:49:57.11270803 +0000 UTC m=+1350.596936935" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.231845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-config-data\") pod \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.232042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-combined-ca-bundle\") 
pod \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.232098 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-scripts\") pod \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.232199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lpdj\" (UniqueName: \"kubernetes.io/projected/90f02b4a-bf97-46a9-94a1-b60db6b01a33-kube-api-access-7lpdj\") pod \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\" (UID: \"90f02b4a-bf97-46a9-94a1-b60db6b01a33\") " Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.238188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-scripts" (OuterVolumeSpecName: "scripts") pod "90f02b4a-bf97-46a9-94a1-b60db6b01a33" (UID: "90f02b4a-bf97-46a9-94a1-b60db6b01a33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.238472 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f02b4a-bf97-46a9-94a1-b60db6b01a33-kube-api-access-7lpdj" (OuterVolumeSpecName: "kube-api-access-7lpdj") pod "90f02b4a-bf97-46a9-94a1-b60db6b01a33" (UID: "90f02b4a-bf97-46a9-94a1-b60db6b01a33"). InnerVolumeSpecName "kube-api-access-7lpdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.260691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90f02b4a-bf97-46a9-94a1-b60db6b01a33" (UID: "90f02b4a-bf97-46a9-94a1-b60db6b01a33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.275178 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-config-data" (OuterVolumeSpecName: "config-data") pod "90f02b4a-bf97-46a9-94a1-b60db6b01a33" (UID: "90f02b4a-bf97-46a9-94a1-b60db6b01a33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.334661 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lpdj\" (UniqueName: \"kubernetes.io/projected/90f02b4a-bf97-46a9-94a1-b60db6b01a33-kube-api-access-7lpdj\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.334709 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.334723 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.334732 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f02b4a-bf97-46a9-94a1-b60db6b01a33-scripts\") on node \"crc\" DevicePath \"\"" Jan 
29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.779095 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zntq4" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.781077 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zntq4" event={"ID":"90f02b4a-bf97-46a9-94a1-b60db6b01a33","Type":"ContainerDied","Data":"92a08920cf03945b6a4a3bf2f7aba93c67e92df073b40e97995c1a0fac19992f"} Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.781133 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92a08920cf03945b6a4a3bf2f7aba93c67e92df073b40e97995c1a0fac19992f" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.781165 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.961300 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.961882 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="678597fa-695b-4d44-aa1c-12127cac6804" containerName="nova-scheduler-scheduler" containerID="cri-o://ed65b8105d3ec214fb32dc626a13214ac153b2773c9490eff8ebf854edbb75d5" gracePeriod=30 Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.982602 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.982906 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60055891-f5cf-4003-87b1-116314371ba3" containerName="nova-api-log" containerID="cri-o://20ae22004480da49b61fb14dac00e4fe5a6c6c0d19c2d0311cf40f94ca7e9d36" gracePeriod=30 Jan 29 03:49:57 crc kubenswrapper[4707]: I0129 03:49:57.983008 4707 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-api-0" podUID="60055891-f5cf-4003-87b1-116314371ba3" containerName="nova-api-api" containerID="cri-o://586b7cbfab597f6d9d8cd87d4a13627b6f1d3b49cf67560a017114f26a6fe3e7" gracePeriod=30 Jan 29 03:49:58 crc kubenswrapper[4707]: I0129 03:49:58.015159 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:49:58 crc kubenswrapper[4707]: I0129 03:49:58.015414 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-log" containerID="cri-o://3bbd1c6ecf561f8ae2565973ac9dc34d83ec9e52f04b968f70bb36b68ea9b7e8" gracePeriod=30 Jan 29 03:49:58 crc kubenswrapper[4707]: I0129 03:49:58.015930 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-metadata" containerID="cri-o://ec94449a991d649c47094022ec2cbc29578fe96f14d79940db68fa092ec2d267" gracePeriod=30 Jan 29 03:49:58 crc kubenswrapper[4707]: I0129 03:49:58.796315 4707 generic.go:334] "Generic (PLEG): container finished" podID="60055891-f5cf-4003-87b1-116314371ba3" containerID="586b7cbfab597f6d9d8cd87d4a13627b6f1d3b49cf67560a017114f26a6fe3e7" exitCode=0 Jan 29 03:49:58 crc kubenswrapper[4707]: I0129 03:49:58.796646 4707 generic.go:334] "Generic (PLEG): container finished" podID="60055891-f5cf-4003-87b1-116314371ba3" containerID="20ae22004480da49b61fb14dac00e4fe5a6c6c0d19c2d0311cf40f94ca7e9d36" exitCode=143 Jan 29 03:49:58 crc kubenswrapper[4707]: I0129 03:49:58.796660 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60055891-f5cf-4003-87b1-116314371ba3","Type":"ContainerDied","Data":"586b7cbfab597f6d9d8cd87d4a13627b6f1d3b49cf67560a017114f26a6fe3e7"} Jan 29 03:49:58 crc kubenswrapper[4707]: I0129 03:49:58.796710 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"60055891-f5cf-4003-87b1-116314371ba3","Type":"ContainerDied","Data":"20ae22004480da49b61fb14dac00e4fe5a6c6c0d19c2d0311cf40f94ca7e9d36"} Jan 29 03:49:58 crc kubenswrapper[4707]: I0129 03:49:58.800138 4707 generic.go:334] "Generic (PLEG): container finished" podID="1d07ad17-c85d-4890-9636-b02564c1c482" containerID="3bbd1c6ecf561f8ae2565973ac9dc34d83ec9e52f04b968f70bb36b68ea9b7e8" exitCode=143 Jan 29 03:49:58 crc kubenswrapper[4707]: I0129 03:49:58.800161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d07ad17-c85d-4890-9636-b02564c1c482","Type":"ContainerDied","Data":"3bbd1c6ecf561f8ae2565973ac9dc34d83ec9e52f04b968f70bb36b68ea9b7e8"} Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.057763 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.183586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60055891-f5cf-4003-87b1-116314371ba3-logs\") pod \"60055891-f5cf-4003-87b1-116314371ba3\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.183928 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60055891-f5cf-4003-87b1-116314371ba3-logs" (OuterVolumeSpecName: "logs") pod "60055891-f5cf-4003-87b1-116314371ba3" (UID: "60055891-f5cf-4003-87b1-116314371ba3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.184067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-config-data\") pod \"60055891-f5cf-4003-87b1-116314371ba3\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.184135 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fpbw\" (UniqueName: \"kubernetes.io/projected/60055891-f5cf-4003-87b1-116314371ba3-kube-api-access-8fpbw\") pod \"60055891-f5cf-4003-87b1-116314371ba3\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.184183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-combined-ca-bundle\") pod \"60055891-f5cf-4003-87b1-116314371ba3\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.184238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-internal-tls-certs\") pod \"60055891-f5cf-4003-87b1-116314371ba3\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.184300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-public-tls-certs\") pod \"60055891-f5cf-4003-87b1-116314371ba3\" (UID: \"60055891-f5cf-4003-87b1-116314371ba3\") " Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.184712 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/60055891-f5cf-4003-87b1-116314371ba3-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.193197 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60055891-f5cf-4003-87b1-116314371ba3-kube-api-access-8fpbw" (OuterVolumeSpecName: "kube-api-access-8fpbw") pod "60055891-f5cf-4003-87b1-116314371ba3" (UID: "60055891-f5cf-4003-87b1-116314371ba3"). InnerVolumeSpecName "kube-api-access-8fpbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.212967 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60055891-f5cf-4003-87b1-116314371ba3" (UID: "60055891-f5cf-4003-87b1-116314371ba3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.215209 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-config-data" (OuterVolumeSpecName: "config-data") pod "60055891-f5cf-4003-87b1-116314371ba3" (UID: "60055891-f5cf-4003-87b1-116314371ba3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.243917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "60055891-f5cf-4003-87b1-116314371ba3" (UID: "60055891-f5cf-4003-87b1-116314371ba3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.247504 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "60055891-f5cf-4003-87b1-116314371ba3" (UID: "60055891-f5cf-4003-87b1-116314371ba3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.286932 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.287213 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fpbw\" (UniqueName: \"kubernetes.io/projected/60055891-f5cf-4003-87b1-116314371ba3-kube-api-access-8fpbw\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.287304 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.287372 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.287443 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60055891-f5cf-4003-87b1-116314371ba3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.815627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"60055891-f5cf-4003-87b1-116314371ba3","Type":"ContainerDied","Data":"96bcbf0c3e6e4fc01661627c22655da0807bdc3563836cd46c3112ce051d4e3c"} Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.815666 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.815691 4707 scope.go:117] "RemoveContainer" containerID="586b7cbfab597f6d9d8cd87d4a13627b6f1d3b49cf67560a017114f26a6fe3e7" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.821111 4707 generic.go:334] "Generic (PLEG): container finished" podID="678597fa-695b-4d44-aa1c-12127cac6804" containerID="ed65b8105d3ec214fb32dc626a13214ac153b2773c9490eff8ebf854edbb75d5" exitCode=0 Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.821174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"678597fa-695b-4d44-aa1c-12127cac6804","Type":"ContainerDied","Data":"ed65b8105d3ec214fb32dc626a13214ac153b2773c9490eff8ebf854edbb75d5"} Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.849618 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.868625 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.870061 4707 scope.go:117] "RemoveContainer" containerID="20ae22004480da49b61fb14dac00e4fe5a6c6c0d19c2d0311cf40f94ca7e9d36" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.889673 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:59 crc kubenswrapper[4707]: E0129 03:49:59.890741 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f02b4a-bf97-46a9-94a1-b60db6b01a33" containerName="nova-manage" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.890760 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="90f02b4a-bf97-46a9-94a1-b60db6b01a33" containerName="nova-manage" Jan 29 03:49:59 crc kubenswrapper[4707]: E0129 03:49:59.890787 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e85360-0f32-4475-a5e8-9af4d4085841" containerName="init" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.890797 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e85360-0f32-4475-a5e8-9af4d4085841" containerName="init" Jan 29 03:49:59 crc kubenswrapper[4707]: E0129 03:49:59.890827 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e85360-0f32-4475-a5e8-9af4d4085841" containerName="dnsmasq-dns" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.890834 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e85360-0f32-4475-a5e8-9af4d4085841" containerName="dnsmasq-dns" Jan 29 03:49:59 crc kubenswrapper[4707]: E0129 03:49:59.890848 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60055891-f5cf-4003-87b1-116314371ba3" containerName="nova-api-api" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.890856 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="60055891-f5cf-4003-87b1-116314371ba3" containerName="nova-api-api" Jan 29 03:49:59 crc kubenswrapper[4707]: E0129 03:49:59.890893 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60055891-f5cf-4003-87b1-116314371ba3" containerName="nova-api-log" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.890900 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="60055891-f5cf-4003-87b1-116314371ba3" containerName="nova-api-log" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.891133 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="60055891-f5cf-4003-87b1-116314371ba3" containerName="nova-api-log" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.891152 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="60055891-f5cf-4003-87b1-116314371ba3" containerName="nova-api-api" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.891169 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f02b4a-bf97-46a9-94a1-b60db6b01a33" containerName="nova-manage" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.891178 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e85360-0f32-4475-a5e8-9af4d4085841" containerName="dnsmasq-dns" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.892625 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.896253 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.896420 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.897130 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.923787 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:49:59 crc kubenswrapper[4707]: I0129 03:49:59.975122 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.005473 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4vl\" (UniqueName: \"kubernetes.io/projected/ce279671-2df3-4af7-a6bb-2ac9fdc048da-kube-api-access-nl4vl\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.005569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.005602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.005657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-config-data\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.005852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.005912 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce279671-2df3-4af7-a6bb-2ac9fdc048da-logs\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.106760 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-combined-ca-bundle\") pod \"678597fa-695b-4d44-aa1c-12127cac6804\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.106821 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-config-data\") pod \"678597fa-695b-4d44-aa1c-12127cac6804\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.106846 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6zrq\" (UniqueName: \"kubernetes.io/projected/678597fa-695b-4d44-aa1c-12127cac6804-kube-api-access-j6zrq\") pod \"678597fa-695b-4d44-aa1c-12127cac6804\" (UID: \"678597fa-695b-4d44-aa1c-12127cac6804\") " Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.107364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-config-data\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.107426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.107480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce279671-2df3-4af7-a6bb-2ac9fdc048da-logs\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.107513 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4vl\" (UniqueName: \"kubernetes.io/projected/ce279671-2df3-4af7-a6bb-2ac9fdc048da-kube-api-access-nl4vl\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.107565 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.107598 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.108066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce279671-2df3-4af7-a6bb-2ac9fdc048da-logs\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.114949 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/678597fa-695b-4d44-aa1c-12127cac6804-kube-api-access-j6zrq" (OuterVolumeSpecName: "kube-api-access-j6zrq") pod "678597fa-695b-4d44-aa1c-12127cac6804" (UID: "678597fa-695b-4d44-aa1c-12127cac6804"). InnerVolumeSpecName "kube-api-access-j6zrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.117667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.119175 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.120204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-config-data\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.121568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce279671-2df3-4af7-a6bb-2ac9fdc048da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.139192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4vl\" (UniqueName: \"kubernetes.io/projected/ce279671-2df3-4af7-a6bb-2ac9fdc048da-kube-api-access-nl4vl\") pod \"nova-api-0\" (UID: 
\"ce279671-2df3-4af7-a6bb-2ac9fdc048da\") " pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.154217 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "678597fa-695b-4d44-aa1c-12127cac6804" (UID: "678597fa-695b-4d44-aa1c-12127cac6804"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.153266 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-config-data" (OuterVolumeSpecName: "config-data") pod "678597fa-695b-4d44-aa1c-12127cac6804" (UID: "678597fa-695b-4d44-aa1c-12127cac6804"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.209737 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.209785 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/678597fa-695b-4d44-aa1c-12127cac6804-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.209800 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6zrq\" (UniqueName: \"kubernetes.io/projected/678597fa-695b-4d44-aa1c-12127cac6804-kube-api-access-j6zrq\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.272409 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.774825 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.835116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"678597fa-695b-4d44-aa1c-12127cac6804","Type":"ContainerDied","Data":"c24618c3b94339ac12dd4dfa36aafd2a10f2dfbc07615d2a0fd577dd8c12a976"} Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.835280 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.835618 4707 scope.go:117] "RemoveContainer" containerID="ed65b8105d3ec214fb32dc626a13214ac153b2773c9490eff8ebf854edbb75d5" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.842850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce279671-2df3-4af7-a6bb-2ac9fdc048da","Type":"ContainerStarted","Data":"10013d5007244e5d7746db9878b935bbf22c917f561371774acc223171bce68e"} Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.937999 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.952323 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.961678 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:50:00 crc kubenswrapper[4707]: E0129 03:50:00.962455 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678597fa-695b-4d44-aa1c-12127cac6804" containerName="nova-scheduler-scheduler" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.962483 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="678597fa-695b-4d44-aa1c-12127cac6804" 
containerName="nova-scheduler-scheduler" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.962823 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="678597fa-695b-4d44-aa1c-12127cac6804" containerName="nova-scheduler-scheduler" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.963900 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.967288 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:50:00 crc kubenswrapper[4707]: I0129 03:50:00.968217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.034501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7164be40-3659-450f-885f-db200baa5ed2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7164be40-3659-450f-885f-db200baa5ed2\") " pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.034778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkx7\" (UniqueName: \"kubernetes.io/projected/7164be40-3659-450f-885f-db200baa5ed2-kube-api-access-5pkx7\") pod \"nova-scheduler-0\" (UID: \"7164be40-3659-450f-885f-db200baa5ed2\") " pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.035185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7164be40-3659-450f-885f-db200baa5ed2-config-data\") pod \"nova-scheduler-0\" (UID: \"7164be40-3659-450f-885f-db200baa5ed2\") " pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.137728 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7164be40-3659-450f-885f-db200baa5ed2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7164be40-3659-450f-885f-db200baa5ed2\") " pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.137807 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkx7\" (UniqueName: \"kubernetes.io/projected/7164be40-3659-450f-885f-db200baa5ed2-kube-api-access-5pkx7\") pod \"nova-scheduler-0\" (UID: \"7164be40-3659-450f-885f-db200baa5ed2\") " pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.137889 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7164be40-3659-450f-885f-db200baa5ed2-config-data\") pod \"nova-scheduler-0\" (UID: \"7164be40-3659-450f-885f-db200baa5ed2\") " pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.142718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7164be40-3659-450f-885f-db200baa5ed2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7164be40-3659-450f-885f-db200baa5ed2\") " pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.143824 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7164be40-3659-450f-885f-db200baa5ed2-config-data\") pod \"nova-scheduler-0\" (UID: \"7164be40-3659-450f-885f-db200baa5ed2\") " pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.156391 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkx7\" (UniqueName: 
\"kubernetes.io/projected/7164be40-3659-450f-885f-db200baa5ed2-kube-api-access-5pkx7\") pod \"nova-scheduler-0\" (UID: \"7164be40-3659-450f-885f-db200baa5ed2\") " pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.255162 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60055891-f5cf-4003-87b1-116314371ba3" path="/var/lib/kubelet/pods/60055891-f5cf-4003-87b1-116314371ba3/volumes" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.255875 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678597fa-695b-4d44-aa1c-12127cac6804" path="/var/lib/kubelet/pods/678597fa-695b-4d44-aa1c-12127cac6804/volumes" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.308142 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.435267 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": read tcp 10.217.0.2:47798->10.217.0.215:8775: read: connection reset by peer" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.435667 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": read tcp 10.217.0.2:47796->10.217.0.215:8775: read: connection reset by peer" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.840142 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.863650 4707 generic.go:334] "Generic (PLEG): container finished" podID="1d07ad17-c85d-4890-9636-b02564c1c482" 
containerID="ec94449a991d649c47094022ec2cbc29578fe96f14d79940db68fa092ec2d267" exitCode=0 Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.865650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d07ad17-c85d-4890-9636-b02564c1c482","Type":"ContainerDied","Data":"ec94449a991d649c47094022ec2cbc29578fe96f14d79940db68fa092ec2d267"} Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.865726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d07ad17-c85d-4890-9636-b02564c1c482","Type":"ContainerDied","Data":"e80767a19ee00e9afb99ceff65a262873c3633974b90e7602c15d72b960e6553"} Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.865747 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e80767a19ee00e9afb99ceff65a262873c3633974b90e7602c15d72b960e6553" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.870473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce279671-2df3-4af7-a6bb-2ac9fdc048da","Type":"ContainerStarted","Data":"f2ab2bc2a0297dd7872ae8bcf0d9f3ac98b1280a5198fbffc3713cf896e2f1ae"} Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.870575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce279671-2df3-4af7-a6bb-2ac9fdc048da","Type":"ContainerStarted","Data":"4512fde2502958d2b9ddb1d63781de834164f8e4cebe0a98ad9a19a2633431b2"} Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.904297 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.904272231 podStartE2EDuration="2.904272231s" podCreationTimestamp="2026-01-29 03:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:50:01.891354902 +0000 UTC m=+1355.375583807" 
watchObservedRunningTime="2026-01-29 03:50:01.904272231 +0000 UTC m=+1355.388501136" Jan 29 03:50:01 crc kubenswrapper[4707]: I0129 03:50:01.963789 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.055702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d07ad17-c85d-4890-9636-b02564c1c482-logs\") pod \"1d07ad17-c85d-4890-9636-b02564c1c482\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.055892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-nova-metadata-tls-certs\") pod \"1d07ad17-c85d-4890-9636-b02564c1c482\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.056761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d07ad17-c85d-4890-9636-b02564c1c482-logs" (OuterVolumeSpecName: "logs") pod "1d07ad17-c85d-4890-9636-b02564c1c482" (UID: "1d07ad17-c85d-4890-9636-b02564c1c482"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.058203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-combined-ca-bundle\") pod \"1d07ad17-c85d-4890-9636-b02564c1c482\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.058319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-config-data\") pod \"1d07ad17-c85d-4890-9636-b02564c1c482\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.058428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skkzj\" (UniqueName: \"kubernetes.io/projected/1d07ad17-c85d-4890-9636-b02564c1c482-kube-api-access-skkzj\") pod \"1d07ad17-c85d-4890-9636-b02564c1c482\" (UID: \"1d07ad17-c85d-4890-9636-b02564c1c482\") " Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.059656 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d07ad17-c85d-4890-9636-b02564c1c482-logs\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.066729 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d07ad17-c85d-4890-9636-b02564c1c482-kube-api-access-skkzj" (OuterVolumeSpecName: "kube-api-access-skkzj") pod "1d07ad17-c85d-4890-9636-b02564c1c482" (UID: "1d07ad17-c85d-4890-9636-b02564c1c482"). InnerVolumeSpecName "kube-api-access-skkzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.092581 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d07ad17-c85d-4890-9636-b02564c1c482" (UID: "1d07ad17-c85d-4890-9636-b02564c1c482"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.146103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-config-data" (OuterVolumeSpecName: "config-data") pod "1d07ad17-c85d-4890-9636-b02564c1c482" (UID: "1d07ad17-c85d-4890-9636-b02564c1c482"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.162201 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.162234 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.162243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skkzj\" (UniqueName: \"kubernetes.io/projected/1d07ad17-c85d-4890-9636-b02564c1c482-kube-api-access-skkzj\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.166938 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-nova-metadata-tls-certs" (OuterVolumeSpecName: 
"nova-metadata-tls-certs") pod "1d07ad17-c85d-4890-9636-b02564c1c482" (UID: "1d07ad17-c85d-4890-9636-b02564c1c482"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.265037 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d07ad17-c85d-4890-9636-b02564c1c482-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.887404 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.889445 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7164be40-3659-450f-885f-db200baa5ed2","Type":"ContainerStarted","Data":"2f99390bcccb82f9564a88fb76cbf1307527ac5b98639f3a589b5a0d25572c5f"} Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.889499 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7164be40-3659-450f-885f-db200baa5ed2","Type":"ContainerStarted","Data":"bbebcc52876053f10514cac3ad6a20131482b13373eab08258311cef961f7430"} Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.933441 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.933413615 podStartE2EDuration="2.933413615s" podCreationTimestamp="2026-01-29 03:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:50:02.913000242 +0000 UTC m=+1356.397229147" watchObservedRunningTime="2026-01-29 03:50:02.933413615 +0000 UTC m=+1356.417642530" Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.968846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 
03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.981718 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:50:02 crc kubenswrapper[4707]: I0129 03:50:02.999639 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:50:03 crc kubenswrapper[4707]: E0129 03:50:03.000182 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-metadata" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.000199 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-metadata" Jan 29 03:50:03 crc kubenswrapper[4707]: E0129 03:50:03.000230 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-log" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.000238 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-log" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.000425 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-metadata" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.000458 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" containerName="nova-metadata-log" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.001931 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.008206 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.008214 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.017768 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.086454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj52h\" (UniqueName: \"kubernetes.io/projected/df23cf25-bfda-4999-85bf-ef4af0738ece-kube-api-access-fj52h\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.086511 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df23cf25-bfda-4999-85bf-ef4af0738ece-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.086842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df23cf25-bfda-4999-85bf-ef4af0738ece-logs\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.087013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df23cf25-bfda-4999-85bf-ef4af0738ece-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.087250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df23cf25-bfda-4999-85bf-ef4af0738ece-config-data\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.190119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df23cf25-bfda-4999-85bf-ef4af0738ece-logs\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.190205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df23cf25-bfda-4999-85bf-ef4af0738ece-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.190360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df23cf25-bfda-4999-85bf-ef4af0738ece-config-data\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.190564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj52h\" (UniqueName: \"kubernetes.io/projected/df23cf25-bfda-4999-85bf-ef4af0738ece-kube-api-access-fj52h\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.190617 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df23cf25-bfda-4999-85bf-ef4af0738ece-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.190707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df23cf25-bfda-4999-85bf-ef4af0738ece-logs\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.198448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df23cf25-bfda-4999-85bf-ef4af0738ece-config-data\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.199262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df23cf25-bfda-4999-85bf-ef4af0738ece-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.212393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df23cf25-bfda-4999-85bf-ef4af0738ece-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0" Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.214927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj52h\" (UniqueName: \"kubernetes.io/projected/df23cf25-bfda-4999-85bf-ef4af0738ece-kube-api-access-fj52h\") pod \"nova-metadata-0\" 
(UID: \"df23cf25-bfda-4999-85bf-ef4af0738ece\") " pod="openstack/nova-metadata-0"
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.257055 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d07ad17-c85d-4890-9636-b02564c1c482" path="/var/lib/kubelet/pods/1d07ad17-c85d-4890-9636-b02564c1c482/volumes"
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.321637 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.463758 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.463843 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.463931 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l"
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.465516 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf0424c462083ab78d6f7ecb618cb72c194589faa9e4ad5d9079c9263ae5a027"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.465665 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://cf0424c462083ab78d6f7ecb618cb72c194589faa9e4ad5d9079c9263ae5a027" gracePeriod=600
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.804889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 29 03:50:03 crc kubenswrapper[4707]: W0129 03:50:03.804896 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf23cf25_bfda_4999_85bf_ef4af0738ece.slice/crio-4bcb4c38760dbac127c402aa4dbc1a4663025e0ac0d47b5c7cb161e9c027027c WatchSource:0}: Error finding container 4bcb4c38760dbac127c402aa4dbc1a4663025e0ac0d47b5c7cb161e9c027027c: Status 404 returned error can't find the container with id 4bcb4c38760dbac127c402aa4dbc1a4663025e0ac0d47b5c7cb161e9c027027c
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.897650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df23cf25-bfda-4999-85bf-ef4af0738ece","Type":"ContainerStarted","Data":"4bcb4c38760dbac127c402aa4dbc1a4663025e0ac0d47b5c7cb161e9c027027c"}
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.902050 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="cf0424c462083ab78d6f7ecb618cb72c194589faa9e4ad5d9079c9263ae5a027" exitCode=0
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.902095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"cf0424c462083ab78d6f7ecb618cb72c194589faa9e4ad5d9079c9263ae5a027"}
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.902149 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204"}
Jan 29 03:50:03 crc kubenswrapper[4707]: I0129 03:50:03.902168 4707 scope.go:117] "RemoveContainer" containerID="b9348d06267b549d79524d7d6fb99695969175eb246c0104709c649f6ca1b571"
Jan 29 03:50:04 crc kubenswrapper[4707]: I0129 03:50:04.917948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df23cf25-bfda-4999-85bf-ef4af0738ece","Type":"ContainerStarted","Data":"924e730f5c9907a8b19789e3140c36d9e26fc223460f732f68a92a77c42d1af8"}
Jan 29 03:50:04 crc kubenswrapper[4707]: I0129 03:50:04.919251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df23cf25-bfda-4999-85bf-ef4af0738ece","Type":"ContainerStarted","Data":"e9d89a99a6e8481a99b5a7a346b7365f3444e83fb40b1e6272003b01bdadf939"}
Jan 29 03:50:04 crc kubenswrapper[4707]: I0129 03:50:04.952625 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.952601214 podStartE2EDuration="2.952601214s" podCreationTimestamp="2026-01-29 03:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:50:04.93643318 +0000 UTC m=+1358.420662075" watchObservedRunningTime="2026-01-29 03:50:04.952601214 +0000 UTC m=+1358.436830119"
Jan 29 03:50:06 crc kubenswrapper[4707]: I0129 03:50:06.308766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 29 03:50:08 crc kubenswrapper[4707]: I0129 03:50:08.321630 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 29 03:50:08 crc kubenswrapper[4707]: I0129 03:50:08.322367 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 29 03:50:10 crc kubenswrapper[4707]: I0129 03:50:10.273460 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 29 03:50:10 crc kubenswrapper[4707]: I0129 03:50:10.274298 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 29 03:50:11 crc kubenswrapper[4707]: I0129 03:50:11.308793 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 29 03:50:11 crc kubenswrapper[4707]: I0129 03:50:11.325735 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce279671-2df3-4af7-a6bb-2ac9fdc048da" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.223:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 29 03:50:11 crc kubenswrapper[4707]: I0129 03:50:11.325749 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce279671-2df3-4af7-a6bb-2ac9fdc048da" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.223:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 29 03:50:11 crc kubenswrapper[4707]: I0129 03:50:11.337446 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 29 03:50:12 crc kubenswrapper[4707]: I0129 03:50:12.088986 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 29 03:50:13 crc kubenswrapper[4707]: I0129 03:50:13.322345 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 29 03:50:13 crc kubenswrapper[4707]: I0129 03:50:13.322995 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 29 03:50:14 crc kubenswrapper[4707]: I0129 03:50:14.334840 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="df23cf25-bfda-4999-85bf-ef4af0738ece" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 29 03:50:14 crc kubenswrapper[4707]: I0129 03:50:14.334858 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="df23cf25-bfda-4999-85bf-ef4af0738ece" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.014147 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.282050 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.283463 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.285045 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.300181 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.751382 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kkv4h"]
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.753662 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.771796 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kkv4h"]
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.834424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-utilities\") pod \"redhat-operators-kkv4h\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.834489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgbg8\" (UniqueName: \"kubernetes.io/projected/fc41ae1b-9615-4c1c-b352-49196fc35cec-kube-api-access-zgbg8\") pod \"redhat-operators-kkv4h\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.834730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-catalog-content\") pod \"redhat-operators-kkv4h\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.937306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-catalog-content\") pod \"redhat-operators-kkv4h\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.937761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-utilities\") pod \"redhat-operators-kkv4h\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.937810 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgbg8\" (UniqueName: \"kubernetes.io/projected/fc41ae1b-9615-4c1c-b352-49196fc35cec-kube-api-access-zgbg8\") pod \"redhat-operators-kkv4h\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.938386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-catalog-content\") pod \"redhat-operators-kkv4h\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.938829 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-utilities\") pod \"redhat-operators-kkv4h\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:20 crc kubenswrapper[4707]: I0129 03:50:20.960304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgbg8\" (UniqueName: \"kubernetes.io/projected/fc41ae1b-9615-4c1c-b352-49196fc35cec-kube-api-access-zgbg8\") pod \"redhat-operators-kkv4h\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:21 crc kubenswrapper[4707]: I0129 03:50:21.126032 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 29 03:50:21 crc kubenswrapper[4707]: I0129 03:50:21.126106 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:21 crc kubenswrapper[4707]: I0129 03:50:21.135010 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 29 03:50:21 crc kubenswrapper[4707]: I0129 03:50:21.661867 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kkv4h"]
Jan 29 03:50:21 crc kubenswrapper[4707]: W0129 03:50:21.678355 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc41ae1b_9615_4c1c_b352_49196fc35cec.slice/crio-8b1053db6d6a214978c33fa474218012a45a6c057622e99cda36d6b29f135a1e WatchSource:0}: Error finding container 8b1053db6d6a214978c33fa474218012a45a6c057622e99cda36d6b29f135a1e: Status 404 returned error can't find the container with id 8b1053db6d6a214978c33fa474218012a45a6c057622e99cda36d6b29f135a1e
Jan 29 03:50:22 crc kubenswrapper[4707]: I0129 03:50:22.135743 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerID="b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee" exitCode=0
Jan 29 03:50:22 crc kubenswrapper[4707]: I0129 03:50:22.135863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkv4h" event={"ID":"fc41ae1b-9615-4c1c-b352-49196fc35cec","Type":"ContainerDied","Data":"b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee"}
Jan 29 03:50:22 crc kubenswrapper[4707]: I0129 03:50:22.136098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkv4h" event={"ID":"fc41ae1b-9615-4c1c-b352-49196fc35cec","Type":"ContainerStarted","Data":"8b1053db6d6a214978c33fa474218012a45a6c057622e99cda36d6b29f135a1e"}
Jan 29 03:50:23 crc kubenswrapper[4707]: I0129 03:50:23.148505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkv4h" event={"ID":"fc41ae1b-9615-4c1c-b352-49196fc35cec","Type":"ContainerStarted","Data":"688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789"}
Jan 29 03:50:23 crc kubenswrapper[4707]: I0129 03:50:23.455670 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 29 03:50:23 crc kubenswrapper[4707]: I0129 03:50:23.463109 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 29 03:50:23 crc kubenswrapper[4707]: I0129 03:50:23.465508 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 29 03:50:24 crc kubenswrapper[4707]: I0129 03:50:24.171310 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 29 03:50:25 crc kubenswrapper[4707]: I0129 03:50:25.174081 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerID="688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789" exitCode=0
Jan 29 03:50:25 crc kubenswrapper[4707]: I0129 03:50:25.174142 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkv4h" event={"ID":"fc41ae1b-9615-4c1c-b352-49196fc35cec","Type":"ContainerDied","Data":"688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789"}
Jan 29 03:50:27 crc kubenswrapper[4707]: I0129 03:50:27.220701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkv4h" event={"ID":"fc41ae1b-9615-4c1c-b352-49196fc35cec","Type":"ContainerStarted","Data":"461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a"}
Jan 29 03:50:27 crc kubenswrapper[4707]: I0129 03:50:27.272783 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kkv4h" podStartSLOduration=3.314940802 podStartE2EDuration="7.272748913s" podCreationTimestamp="2026-01-29 03:50:20 +0000 UTC" firstStartedPulling="2026-01-29 03:50:22.137434839 +0000 UTC m=+1375.621663744" lastFinishedPulling="2026-01-29 03:50:26.09524291 +0000 UTC m=+1379.579471855" observedRunningTime="2026-01-29 03:50:27.259146934 +0000 UTC m=+1380.743375849" watchObservedRunningTime="2026-01-29 03:50:27.272748913 +0000 UTC m=+1380.756977828"
Jan 29 03:50:31 crc kubenswrapper[4707]: I0129 03:50:31.126274 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:31 crc kubenswrapper[4707]: I0129 03:50:31.126690 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kkv4h"
Jan 29 03:50:31 crc kubenswrapper[4707]: I0129 03:50:31.643864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 03:50:32 crc kubenswrapper[4707]: I0129 03:50:32.177273 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kkv4h" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="registry-server" probeResult="failure" output=<
Jan 29 03:50:32 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s
Jan 29 03:50:32 crc kubenswrapper[4707]: >
Jan 29 03:50:32 crc kubenswrapper[4707]: I0129 03:50:32.863763 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 29 03:50:36 crc kubenswrapper[4707]: I0129 03:50:36.058805 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d6e14cde-a343-4dc3-b429-77968ac0b7a5" containerName="rabbitmq" containerID="cri-o://e4a63754624f94ee910cfd791357cb62838ed5eec49488f409edb3aaf62a64ac" gracePeriod=604796
Jan 29 03:50:37 crc kubenswrapper[4707]: I0129 03:50:37.241006 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b8dec80d-f976-4316-9d4a-c18cbefe36ba" containerName="rabbitmq" containerID="cri-o://06e5b149d2301f0677cec170ad9b62ae8d43974756045dd69ee1c1c44a53baba" gracePeriod=604796
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.183702 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kkv4h" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="registry-server" probeResult="failure" output=<
Jan 29 03:50:42 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s
Jan 29 03:50:42 crc kubenswrapper[4707]: >
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.420152 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6e14cde-a343-4dc3-b429-77968ac0b7a5" containerID="e4a63754624f94ee910cfd791357cb62838ed5eec49488f409edb3aaf62a64ac" exitCode=0
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.420871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6e14cde-a343-4dc3-b429-77968ac0b7a5","Type":"ContainerDied","Data":"e4a63754624f94ee910cfd791357cb62838ed5eec49488f409edb3aaf62a64ac"}
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.689005 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.760692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-config-data\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.760858 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-server-conf\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.760895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-plugins-conf\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.760929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6e14cde-a343-4dc3-b429-77968ac0b7a5-pod-info\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.760951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-erlang-cookie\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.760972 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-confd\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.761014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-tls\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.761078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6e14cde-a343-4dc3-b429-77968ac0b7a5-erlang-cookie-secret\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.761105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7pvj\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-kube-api-access-f7pvj\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.761132 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.761170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-plugins\") pod \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\" (UID: \"d6e14cde-a343-4dc3-b429-77968ac0b7a5\") "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.763222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.763447 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.763586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.772953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-kube-api-access-f7pvj" (OuterVolumeSpecName: "kube-api-access-f7pvj") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "kube-api-access-f7pvj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.773051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.773079 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d6e14cde-a343-4dc3-b429-77968ac0b7a5-pod-info" (OuterVolumeSpecName: "pod-info") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.782234 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e14cde-a343-4dc3-b429-77968ac0b7a5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.782376 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.820394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-config-data" (OuterVolumeSpecName: "config-data") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.864131 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.864172 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.864184 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6e14cde-a343-4dc3-b429-77968ac0b7a5-pod-info\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.864193 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.864211 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.864223 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6e14cde-a343-4dc3-b429-77968ac0b7a5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.864235 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7pvj\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-kube-api-access-f7pvj\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.864271 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.864281 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.874431 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-server-conf" (OuterVolumeSpecName: "server-conf") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.886445 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.954194 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d6e14cde-a343-4dc3-b429-77968ac0b7a5" (UID: "d6e14cde-a343-4dc3-b429-77968ac0b7a5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.966822 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6e14cde-a343-4dc3-b429-77968ac0b7a5-server-conf\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.966871 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6e14cde-a343-4dc3-b429-77968ac0b7a5-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:42 crc kubenswrapper[4707]: I0129 03:50:42.966884 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.439150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6e14cde-a343-4dc3-b429-77968ac0b7a5","Type":"ContainerDied","Data":"d6b4ebfd47893d08887693be3916da075be659eb379eae895103612113e9a77b"}
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.439228 4707 scope.go:117] "RemoveContainer" containerID="e4a63754624f94ee910cfd791357cb62838ed5eec49488f409edb3aaf62a64ac"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.439175 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.474816 4707 scope.go:117] "RemoveContainer" containerID="6503da289d12f1bfc63a47f7d284f1b2b501095f2d4223347f4b591fe440389f"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.476666 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.489429 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.509218 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 03:50:43 crc kubenswrapper[4707]: E0129 03:50:43.509738 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e14cde-a343-4dc3-b429-77968ac0b7a5" containerName="setup-container"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.509758 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e14cde-a343-4dc3-b429-77968ac0b7a5" containerName="setup-container"
Jan 29 03:50:43 crc kubenswrapper[4707]: E0129 03:50:43.509789 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e14cde-a343-4dc3-b429-77968ac0b7a5" containerName="rabbitmq"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.509804 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e14cde-a343-4dc3-b429-77968ac0b7a5" containerName="rabbitmq"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.509996 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e14cde-a343-4dc3-b429-77968ac0b7a5" containerName="rabbitmq"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.511039 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.515981 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.516308 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.516514 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.516789 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.516945 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bfks5"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.517675 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.518905 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.556753 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb587ed3-9015-4748-a28b-10d4132ffdfb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581263 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb587ed3-9015-4748-a28b-10d4132ffdfb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581371 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb587ed3-9015-4748-a28b-10d4132ffdfb-config-data\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb587ed3-9015-4748-a28b-10d4132ffdfb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krtnw\" (UniqueName: \"kubernetes.io/projected/fb587ed3-9015-4748-a28b-10d4132ffdfb-kube-api-access-krtnw\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb587ed3-9015-4748-a28b-10d4132ffdfb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581500 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.581523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0"
Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.683708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID:
\"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.683796 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.683826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb587ed3-9015-4748-a28b-10d4132ffdfb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.683872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb587ed3-9015-4748-a28b-10d4132ffdfb-config-data\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.683917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb587ed3-9015-4748-a28b-10d4132ffdfb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.683960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krtnw\" (UniqueName: \"kubernetes.io/projected/fb587ed3-9015-4748-a28b-10d4132ffdfb-kube-api-access-krtnw\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.683992 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb587ed3-9015-4748-a28b-10d4132ffdfb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.684016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.684037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.684066 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb587ed3-9015-4748-a28b-10d4132ffdfb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.684089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.684866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.685180 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.685578 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb587ed3-9015-4748-a28b-10d4132ffdfb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.686288 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb587ed3-9015-4748-a28b-10d4132ffdfb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.686598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb587ed3-9015-4748-a28b-10d4132ffdfb-config-data\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.688755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " 
pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.689944 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.690661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb587ed3-9015-4748-a28b-10d4132ffdfb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.694347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb587ed3-9015-4748-a28b-10d4132ffdfb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.696348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb587ed3-9015-4748-a28b-10d4132ffdfb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.705126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krtnw\" (UniqueName: \"kubernetes.io/projected/fb587ed3-9015-4748-a28b-10d4132ffdfb-kube-api-access-krtnw\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.741144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"fb587ed3-9015-4748-a28b-10d4132ffdfb\") " pod="openstack/rabbitmq-server-0" Jan 29 03:50:43 crc kubenswrapper[4707]: I0129 03:50:43.879419 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.348729 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.455314 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8dec80d-f976-4316-9d4a-c18cbefe36ba" containerID="06e5b149d2301f0677cec170ad9b62ae8d43974756045dd69ee1c1c44a53baba" exitCode=0 Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.455405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8dec80d-f976-4316-9d4a-c18cbefe36ba","Type":"ContainerDied","Data":"06e5b149d2301f0677cec170ad9b62ae8d43974756045dd69ee1c1c44a53baba"} Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.474621 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fb587ed3-9015-4748-a28b-10d4132ffdfb","Type":"ContainerStarted","Data":"8da7cced378ef3ab2a7e254b032d72d06b6beb1c434de924f0ed280c2fd66ae6"} Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.653176 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.711904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8dec80d-f976-4316-9d4a-c18cbefe36ba-erlang-cookie-secret\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.711993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.712038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-plugins\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.712076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqnhc\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-kube-api-access-bqnhc\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.712128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-config-data\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.712162 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-server-conf\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.712201 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-plugins-conf\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.712288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-erlang-cookie\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.712323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8dec80d-f976-4316-9d4a-c18cbefe36ba-pod-info\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.712375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-confd\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.712413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-tls\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 
03:50:44.715390 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.720649 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8dec80d-f976-4316-9d4a-c18cbefe36ba-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.720989 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.721691 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.722387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-kube-api-access-bqnhc" (OuterVolumeSpecName: "kube-api-access-bqnhc") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "kube-api-access-bqnhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.722450 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.729550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.729715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b8dec80d-f976-4316-9d4a-c18cbefe36ba-pod-info" (OuterVolumeSpecName: "pod-info") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.766213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-config-data" (OuterVolumeSpecName: "config-data") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.777150 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-fsh5f"] Jan 29 03:50:44 crc kubenswrapper[4707]: E0129 03:50:44.777905 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8dec80d-f976-4316-9d4a-c18cbefe36ba" containerName="setup-container" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.777939 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8dec80d-f976-4316-9d4a-c18cbefe36ba" containerName="setup-container" Jan 29 03:50:44 crc kubenswrapper[4707]: E0129 03:50:44.777961 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8dec80d-f976-4316-9d4a-c18cbefe36ba" containerName="rabbitmq" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.777968 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8dec80d-f976-4316-9d4a-c18cbefe36ba" containerName="rabbitmq" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.778198 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8dec80d-f976-4316-9d4a-c18cbefe36ba" containerName="rabbitmq" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.779613 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.786921 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.806804 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-fsh5f"] Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.814169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-server-conf" (OuterVolumeSpecName: "server-conf") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: W0129 03:50:44.816396 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b8dec80d-f976-4316-9d4a-c18cbefe36ba/volumes/kubernetes.io~configmap/server-conf Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.816416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-server-conf" (OuterVolumeSpecName: "server-conf") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.816573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-server-conf\") pod \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\" (UID: \"b8dec80d-f976-4316-9d4a-c18cbefe36ba\") " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.820555 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6cl\" (UniqueName: \"kubernetes.io/projected/a536bc34-4f68-4f0e-ba61-3672fcffaacf-kube-api-access-vp6cl\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.820696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.820740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-config\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.820911 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: 
\"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.822364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.822596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.822653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.823169 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.824342 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.824390 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqnhc\" (UniqueName: 
\"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-kube-api-access-bqnhc\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.824408 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.824423 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-server-conf\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.824437 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8dec80d-f976-4316-9d4a-c18cbefe36ba-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.824453 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.824466 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8dec80d-f976-4316-9d4a-c18cbefe36ba-pod-info\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.824479 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.824491 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8dec80d-f976-4316-9d4a-c18cbefe36ba-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 29 
03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.862366 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.878863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b8dec80d-f976-4316-9d4a-c18cbefe36ba" (UID: "b8dec80d-f976-4316-9d4a-c18cbefe36ba"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.926335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.926457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.926492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.926516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.926579 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6cl\" (UniqueName: \"kubernetes.io/projected/a536bc34-4f68-4f0e-ba61-3672fcffaacf-kube-api-access-vp6cl\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.926615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.926636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-config\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.926738 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8dec80d-f976-4316-9d4a-c18cbefe36ba-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.926753 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.927427 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.927445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.928010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.928122 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.928422 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-config\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.928481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:44 crc kubenswrapper[4707]: I0129 03:50:44.947363 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6cl\" (UniqueName: \"kubernetes.io/projected/a536bc34-4f68-4f0e-ba61-3672fcffaacf-kube-api-access-vp6cl\") pod \"dnsmasq-dns-7d84b4d45c-fsh5f\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.110474 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.273785 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e14cde-a343-4dc3-b429-77968ac0b7a5" path="/var/lib/kubelet/pods/d6e14cde-a343-4dc3-b429-77968ac0b7a5/volumes" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.490779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8dec80d-f976-4316-9d4a-c18cbefe36ba","Type":"ContainerDied","Data":"30e28dd9d5fd9cb817c6bf9759a2dec425a1ed67dde8c6f96b26e7424f9d44d4"} Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.490825 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.491158 4707 scope.go:117] "RemoveContainer" containerID="06e5b149d2301f0677cec170ad9b62ae8d43974756045dd69ee1c1c44a53baba" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.528406 4707 scope.go:117] "RemoveContainer" containerID="6d5d6d2e0fa06b8db3118e28e603b5b03f428a0b7fde78b5b540f4727f0a499a" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.537167 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.574277 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.589396 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.591091 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.591186 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.598227 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-bglwv" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.598258 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.598313 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.598343 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.598258 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.598426 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.598495 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/688208b9-5567-4a47-9ec9-76ce03ec8991-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/688208b9-5567-4a47-9ec9-76ce03ec8991-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645524 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/688208b9-5567-4a47-9ec9-76ce03ec8991-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqk2\" (UniqueName: \"kubernetes.io/projected/688208b9-5567-4a47-9ec9-76ce03ec8991-kube-api-access-6jqk2\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645694 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/688208b9-5567-4a47-9ec9-76ce03ec8991-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.645756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/688208b9-5567-4a47-9ec9-76ce03ec8991-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.714181 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-fsh5f"] Jan 29 03:50:45 crc kubenswrapper[4707]: W0129 03:50:45.720649 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda536bc34_4f68_4f0e_ba61_3672fcffaacf.slice/crio-864a0972233b8fd1373e0d91d4767848c29131fb743dfa7c02d2d1151d38519b WatchSource:0}: Error finding container 864a0972233b8fd1373e0d91d4767848c29131fb743dfa7c02d2d1151d38519b: Status 404 returned error can't find the container with id 864a0972233b8fd1373e0d91d4767848c29131fb743dfa7c02d2d1151d38519b Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/688208b9-5567-4a47-9ec9-76ce03ec8991-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/688208b9-5567-4a47-9ec9-76ce03ec8991-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747528 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/688208b9-5567-4a47-9ec9-76ce03ec8991-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqk2\" (UniqueName: \"kubernetes.io/projected/688208b9-5567-4a47-9ec9-76ce03ec8991-kube-api-access-6jqk2\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747677 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/688208b9-5567-4a47-9ec9-76ce03ec8991-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.747701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/688208b9-5567-4a47-9ec9-76ce03ec8991-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.748100 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.750381 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.750421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.751671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/688208b9-5567-4a47-9ec9-76ce03ec8991-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.752670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/688208b9-5567-4a47-9ec9-76ce03ec8991-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.753164 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/688208b9-5567-4a47-9ec9-76ce03ec8991-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.775254 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.814724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.815289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/688208b9-5567-4a47-9ec9-76ce03ec8991-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.815390 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/688208b9-5567-4a47-9ec9-76ce03ec8991-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.815522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/688208b9-5567-4a47-9ec9-76ce03ec8991-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:45 crc kubenswrapper[4707]: I0129 03:50:45.826408 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jqk2\" (UniqueName: \"kubernetes.io/projected/688208b9-5567-4a47-9ec9-76ce03ec8991-kube-api-access-6jqk2\") pod \"rabbitmq-cell1-server-0\" (UID: \"688208b9-5567-4a47-9ec9-76ce03ec8991\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:46 crc kubenswrapper[4707]: I0129 03:50:46.004242 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:50:46 crc kubenswrapper[4707]: I0129 03:50:46.508349 4707 generic.go:334] "Generic (PLEG): container finished" podID="a536bc34-4f68-4f0e-ba61-3672fcffaacf" containerID="50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549" exitCode=0 Jan 29 03:50:46 crc kubenswrapper[4707]: I0129 03:50:46.508631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" event={"ID":"a536bc34-4f68-4f0e-ba61-3672fcffaacf","Type":"ContainerDied","Data":"50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549"} Jan 29 03:50:46 crc kubenswrapper[4707]: I0129 03:50:46.508750 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" event={"ID":"a536bc34-4f68-4f0e-ba61-3672fcffaacf","Type":"ContainerStarted","Data":"864a0972233b8fd1373e0d91d4767848c29131fb743dfa7c02d2d1151d38519b"} Jan 29 03:50:46 crc kubenswrapper[4707]: I0129 03:50:46.513420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fb587ed3-9015-4748-a28b-10d4132ffdfb","Type":"ContainerStarted","Data":"6eeb1c385c96c6cc3b01e6b77c0e2a06459619465c90cbdc7c73439c5282b464"} Jan 29 03:50:46 crc kubenswrapper[4707]: I0129 03:50:46.594501 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 03:50:46 crc kubenswrapper[4707]: W0129 03:50:46.598852 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod688208b9_5567_4a47_9ec9_76ce03ec8991.slice/crio-df197c891b55230479f6f04880d5b3cd5c89fddf23a99f9b500f23a5c006ebc4 WatchSource:0}: Error finding container df197c891b55230479f6f04880d5b3cd5c89fddf23a99f9b500f23a5c006ebc4: Status 404 returned error can't find the container with id df197c891b55230479f6f04880d5b3cd5c89fddf23a99f9b500f23a5c006ebc4 Jan 29 03:50:47 crc kubenswrapper[4707]: I0129 
03:50:47.261246 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8dec80d-f976-4316-9d4a-c18cbefe36ba" path="/var/lib/kubelet/pods/b8dec80d-f976-4316-9d4a-c18cbefe36ba/volumes" Jan 29 03:50:47 crc kubenswrapper[4707]: I0129 03:50:47.528997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"688208b9-5567-4a47-9ec9-76ce03ec8991","Type":"ContainerStarted","Data":"df197c891b55230479f6f04880d5b3cd5c89fddf23a99f9b500f23a5c006ebc4"} Jan 29 03:50:47 crc kubenswrapper[4707]: I0129 03:50:47.531685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" event={"ID":"a536bc34-4f68-4f0e-ba61-3672fcffaacf","Type":"ContainerStarted","Data":"985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c"} Jan 29 03:50:47 crc kubenswrapper[4707]: I0129 03:50:47.532348 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:47 crc kubenswrapper[4707]: I0129 03:50:47.555238 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" podStartSLOduration=3.55521311 podStartE2EDuration="3.55521311s" podCreationTimestamp="2026-01-29 03:50:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:50:47.554337285 +0000 UTC m=+1401.038566190" watchObservedRunningTime="2026-01-29 03:50:47.55521311 +0000 UTC m=+1401.039442025" Jan 29 03:50:48 crc kubenswrapper[4707]: I0129 03:50:48.543101 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"688208b9-5567-4a47-9ec9-76ce03ec8991","Type":"ContainerStarted","Data":"d33e5ae8ee9f0ac2014772a9c8e416731636c292ecbda4123adbff8ae6aa82e0"} Jan 29 03:50:52 crc kubenswrapper[4707]: I0129 03:50:52.183595 4707 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-kkv4h" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="registry-server" probeResult="failure" output=< Jan 29 03:50:52 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 29 03:50:52 crc kubenswrapper[4707]: > Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.112895 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.224198 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q"] Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.225634 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" podUID="c3cea440-5000-4c97-96ab-6436f2a69e02" containerName="dnsmasq-dns" containerID="cri-o://ac6a46f30ac327783b2b2ce5e8c1294e97aee8f18f614a5b0fcc6c2e2b0f9762" gracePeriod=10 Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.494554 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-4wd7q"] Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.496307 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.529470 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-4wd7q"] Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.532906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.532996 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-config\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.533032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.533060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvsp2\" (UniqueName: \"kubernetes.io/projected/fdf17e92-84cc-4d06-ba4f-714cfd41c134-kube-api-access-mvsp2\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.533081 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.533129 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.533152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.617304 4707 generic.go:334] "Generic (PLEG): container finished" podID="c3cea440-5000-4c97-96ab-6436f2a69e02" containerID="ac6a46f30ac327783b2b2ce5e8c1294e97aee8f18f614a5b0fcc6c2e2b0f9762" exitCode=0 Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.617594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" event={"ID":"c3cea440-5000-4c97-96ab-6436f2a69e02","Type":"ContainerDied","Data":"ac6a46f30ac327783b2b2ce5e8c1294e97aee8f18f614a5b0fcc6c2e2b0f9762"} Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.635121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-config\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " 
pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.635197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.635237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvsp2\" (UniqueName: \"kubernetes.io/projected/fdf17e92-84cc-4d06-ba4f-714cfd41c134-kube-api-access-mvsp2\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.635284 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.635355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.635389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " 
pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.635454 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.636413 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.637006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-config\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.637556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.638364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.638919 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.639505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fdf17e92-84cc-4d06-ba4f-714cfd41c134-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.662167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvsp2\" (UniqueName: \"kubernetes.io/projected/fdf17e92-84cc-4d06-ba4f-714cfd41c134-kube-api-access-mvsp2\") pod \"dnsmasq-dns-6f6df4f56c-4wd7q\" (UID: \"fdf17e92-84cc-4d06-ba4f-714cfd41c134\") " pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.762129 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.829807 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.840793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-config\") pod \"c3cea440-5000-4c97-96ab-6436f2a69e02\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.841055 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-swift-storage-0\") pod \"c3cea440-5000-4c97-96ab-6436f2a69e02\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.841087 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57g28\" (UniqueName: \"kubernetes.io/projected/c3cea440-5000-4c97-96ab-6436f2a69e02-kube-api-access-57g28\") pod \"c3cea440-5000-4c97-96ab-6436f2a69e02\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.841110 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-nb\") pod \"c3cea440-5000-4c97-96ab-6436f2a69e02\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.841205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-svc\") pod \"c3cea440-5000-4c97-96ab-6436f2a69e02\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.841265 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-sb\") pod \"c3cea440-5000-4c97-96ab-6436f2a69e02\" (UID: \"c3cea440-5000-4c97-96ab-6436f2a69e02\") " Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.865464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3cea440-5000-4c97-96ab-6436f2a69e02-kube-api-access-57g28" (OuterVolumeSpecName: "kube-api-access-57g28") pod "c3cea440-5000-4c97-96ab-6436f2a69e02" (UID: "c3cea440-5000-4c97-96ab-6436f2a69e02"). InnerVolumeSpecName "kube-api-access-57g28". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.897195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c3cea440-5000-4c97-96ab-6436f2a69e02" (UID: "c3cea440-5000-4c97-96ab-6436f2a69e02"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.902455 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-config" (OuterVolumeSpecName: "config") pod "c3cea440-5000-4c97-96ab-6436f2a69e02" (UID: "c3cea440-5000-4c97-96ab-6436f2a69e02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.905733 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3cea440-5000-4c97-96ab-6436f2a69e02" (UID: "c3cea440-5000-4c97-96ab-6436f2a69e02"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.906654 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3cea440-5000-4c97-96ab-6436f2a69e02" (UID: "c3cea440-5000-4c97-96ab-6436f2a69e02"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.910305 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c3cea440-5000-4c97-96ab-6436f2a69e02" (UID: "c3cea440-5000-4c97-96ab-6436f2a69e02"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.944221 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.944604 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.944618 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.944627 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 
03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.944637 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57g28\" (UniqueName: \"kubernetes.io/projected/c3cea440-5000-4c97-96ab-6436f2a69e02-kube-api-access-57g28\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:55 crc kubenswrapper[4707]: I0129 03:50:55.944645 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3cea440-5000-4c97-96ab-6436f2a69e02-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 03:50:56 crc kubenswrapper[4707]: I0129 03:50:56.345716 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-4wd7q"] Jan 29 03:50:56 crc kubenswrapper[4707]: I0129 03:50:56.629679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" event={"ID":"fdf17e92-84cc-4d06-ba4f-714cfd41c134","Type":"ContainerStarted","Data":"80c5b65d44e07fda486c7d3bd355791d72944554b0d0a4064106a9480efd7156"} Jan 29 03:50:56 crc kubenswrapper[4707]: I0129 03:50:56.631969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" event={"ID":"c3cea440-5000-4c97-96ab-6436f2a69e02","Type":"ContainerDied","Data":"b73f6c28a966b865269e061eaed71a049eb8806b4d38467f011e2cfa1eac16ce"} Jan 29 03:50:56 crc kubenswrapper[4707]: I0129 03:50:56.632067 4707 scope.go:117] "RemoveContainer" containerID="ac6a46f30ac327783b2b2ce5e8c1294e97aee8f18f614a5b0fcc6c2e2b0f9762" Jan 29 03:50:56 crc kubenswrapper[4707]: I0129 03:50:56.632363 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q" Jan 29 03:50:56 crc kubenswrapper[4707]: I0129 03:50:56.662855 4707 scope.go:117] "RemoveContainer" containerID="c2799b8df9451fead2668b0350f816d18b6885f5f3533e4cbc943f7943c22b99" Jan 29 03:50:56 crc kubenswrapper[4707]: I0129 03:50:56.690045 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q"] Jan 29 03:50:56 crc kubenswrapper[4707]: I0129 03:50:56.699274 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hzv6q"] Jan 29 03:50:57 crc kubenswrapper[4707]: I0129 03:50:57.255476 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3cea440-5000-4c97-96ab-6436f2a69e02" path="/var/lib/kubelet/pods/c3cea440-5000-4c97-96ab-6436f2a69e02/volumes" Jan 29 03:50:57 crc kubenswrapper[4707]: I0129 03:50:57.644865 4707 generic.go:334] "Generic (PLEG): container finished" podID="fdf17e92-84cc-4d06-ba4f-714cfd41c134" containerID="2c80070de92dfb0134749dec176b633804cdc68ff3d375dc88d9c8be5a0dfad1" exitCode=0 Jan 29 03:50:57 crc kubenswrapper[4707]: I0129 03:50:57.644960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" event={"ID":"fdf17e92-84cc-4d06-ba4f-714cfd41c134","Type":"ContainerDied","Data":"2c80070de92dfb0134749dec176b633804cdc68ff3d375dc88d9c8be5a0dfad1"} Jan 29 03:50:58 crc kubenswrapper[4707]: I0129 03:50:58.661808 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" event={"ID":"fdf17e92-84cc-4d06-ba4f-714cfd41c134","Type":"ContainerStarted","Data":"5a4dcb3457e71c091d4b1c47697fb4a13523ebe4b3081fef36c7d21fe25e3d40"} Jan 29 03:50:58 crc kubenswrapper[4707]: I0129 03:50:58.662037 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:50:58 crc kubenswrapper[4707]: I0129 03:50:58.688818 4707 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" podStartSLOduration=3.688782947 podStartE2EDuration="3.688782947s" podCreationTimestamp="2026-01-29 03:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:50:58.686150419 +0000 UTC m=+1412.170379334" watchObservedRunningTime="2026-01-29 03:50:58.688782947 +0000 UTC m=+1412.173011862" Jan 29 03:51:01 crc kubenswrapper[4707]: I0129 03:51:01.197506 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kkv4h" Jan 29 03:51:01 crc kubenswrapper[4707]: I0129 03:51:01.270334 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kkv4h" Jan 29 03:51:01 crc kubenswrapper[4707]: I0129 03:51:01.450638 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kkv4h"] Jan 29 03:51:02 crc kubenswrapper[4707]: I0129 03:51:02.703291 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kkv4h" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="registry-server" containerID="cri-o://461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a" gracePeriod=2 Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.252064 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kkv4h" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.308058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-catalog-content\") pod \"fc41ae1b-9615-4c1c-b352-49196fc35cec\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.308420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-utilities\") pod \"fc41ae1b-9615-4c1c-b352-49196fc35cec\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.309487 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-utilities" (OuterVolumeSpecName: "utilities") pod "fc41ae1b-9615-4c1c-b352-49196fc35cec" (UID: "fc41ae1b-9615-4c1c-b352-49196fc35cec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.410163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgbg8\" (UniqueName: \"kubernetes.io/projected/fc41ae1b-9615-4c1c-b352-49196fc35cec-kube-api-access-zgbg8\") pod \"fc41ae1b-9615-4c1c-b352-49196fc35cec\" (UID: \"fc41ae1b-9615-4c1c-b352-49196fc35cec\") " Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.410792 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.417737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc41ae1b-9615-4c1c-b352-49196fc35cec-kube-api-access-zgbg8" (OuterVolumeSpecName: "kube-api-access-zgbg8") pod "fc41ae1b-9615-4c1c-b352-49196fc35cec" (UID: "fc41ae1b-9615-4c1c-b352-49196fc35cec"). InnerVolumeSpecName "kube-api-access-zgbg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.438597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc41ae1b-9615-4c1c-b352-49196fc35cec" (UID: "fc41ae1b-9615-4c1c-b352-49196fc35cec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.514034 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgbg8\" (UniqueName: \"kubernetes.io/projected/fc41ae1b-9615-4c1c-b352-49196fc35cec-kube-api-access-zgbg8\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.514079 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc41ae1b-9615-4c1c-b352-49196fc35cec-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.723420 4707 generic.go:334] "Generic (PLEG): container finished" podID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerID="461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a" exitCode=0 Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.723550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkv4h" event={"ID":"fc41ae1b-9615-4c1c-b352-49196fc35cec","Type":"ContainerDied","Data":"461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a"} Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.723599 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kkv4h" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.723629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kkv4h" event={"ID":"fc41ae1b-9615-4c1c-b352-49196fc35cec","Type":"ContainerDied","Data":"8b1053db6d6a214978c33fa474218012a45a6c057622e99cda36d6b29f135a1e"} Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.723651 4707 scope.go:117] "RemoveContainer" containerID="461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.757465 4707 scope.go:117] "RemoveContainer" containerID="688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.776146 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kkv4h"] Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.784489 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kkv4h"] Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.808478 4707 scope.go:117] "RemoveContainer" containerID="b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.871482 4707 scope.go:117] "RemoveContainer" containerID="461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a" Jan 29 03:51:03 crc kubenswrapper[4707]: E0129 03:51:03.872241 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a\": container with ID starting with 461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a not found: ID does not exist" containerID="461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.872281 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a"} err="failed to get container status \"461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a\": rpc error: code = NotFound desc = could not find container \"461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a\": container with ID starting with 461a954ac86608c0a24c6f43438dbd3a45eca6e819cce9d3078a6b5d5e6db86a not found: ID does not exist" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.872312 4707 scope.go:117] "RemoveContainer" containerID="688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789" Jan 29 03:51:03 crc kubenswrapper[4707]: E0129 03:51:03.872678 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789\": container with ID starting with 688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789 not found: ID does not exist" containerID="688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.872731 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789"} err="failed to get container status \"688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789\": rpc error: code = NotFound desc = could not find container \"688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789\": container with ID starting with 688a8fe774d0697e3ed605d4f25070b8a8c809f68e3c0412063c9c2034fb8789 not found: ID does not exist" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.872770 4707 scope.go:117] "RemoveContainer" containerID="b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee" Jan 29 03:51:03 crc kubenswrapper[4707]: E0129 
03:51:03.873324 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee\": container with ID starting with b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee not found: ID does not exist" containerID="b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee" Jan 29 03:51:03 crc kubenswrapper[4707]: I0129 03:51:03.873355 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee"} err="failed to get container status \"b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee\": rpc error: code = NotFound desc = could not find container \"b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee\": container with ID starting with b52790bda3edf72695f437aa24c9c255ee35a08c65a84ce5b394eadb2c9ab0ee not found: ID does not exist" Jan 29 03:51:05 crc kubenswrapper[4707]: I0129 03:51:05.278586 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" path="/var/lib/kubelet/pods/fc41ae1b-9615-4c1c-b352-49196fc35cec/volumes" Jan 29 03:51:05 crc kubenswrapper[4707]: I0129 03:51:05.831521 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-4wd7q" Jan 29 03:51:05 crc kubenswrapper[4707]: I0129 03:51:05.913996 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-fsh5f"] Jan 29 03:51:05 crc kubenswrapper[4707]: I0129 03:51:05.914396 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" podUID="a536bc34-4f68-4f0e-ba61-3672fcffaacf" containerName="dnsmasq-dns" containerID="cri-o://985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c" gracePeriod=10 Jan 29 03:51:06 crc 
kubenswrapper[4707]: I0129 03:51:06.488619 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.498248 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-svc\") pod \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.498296 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-swift-storage-0\") pod \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.498339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-config\") pod \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.498357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-sb\") pod \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.498433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-openstack-edpm-ipam\") pod \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.498487 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp6cl\" (UniqueName: \"kubernetes.io/projected/a536bc34-4f68-4f0e-ba61-3672fcffaacf-kube-api-access-vp6cl\") pod \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.498523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-nb\") pod \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\" (UID: \"a536bc34-4f68-4f0e-ba61-3672fcffaacf\") " Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.525341 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a536bc34-4f68-4f0e-ba61-3672fcffaacf-kube-api-access-vp6cl" (OuterVolumeSpecName: "kube-api-access-vp6cl") pod "a536bc34-4f68-4f0e-ba61-3672fcffaacf" (UID: "a536bc34-4f68-4f0e-ba61-3672fcffaacf"). InnerVolumeSpecName "kube-api-access-vp6cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.573521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a536bc34-4f68-4f0e-ba61-3672fcffaacf" (UID: "a536bc34-4f68-4f0e-ba61-3672fcffaacf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.579319 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a536bc34-4f68-4f0e-ba61-3672fcffaacf" (UID: "a536bc34-4f68-4f0e-ba61-3672fcffaacf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.583025 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a536bc34-4f68-4f0e-ba61-3672fcffaacf" (UID: "a536bc34-4f68-4f0e-ba61-3672fcffaacf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.590377 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a536bc34-4f68-4f0e-ba61-3672fcffaacf" (UID: "a536bc34-4f68-4f0e-ba61-3672fcffaacf"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.591489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a536bc34-4f68-4f0e-ba61-3672fcffaacf" (UID: "a536bc34-4f68-4f0e-ba61-3672fcffaacf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.591991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-config" (OuterVolumeSpecName: "config") pod "a536bc34-4f68-4f0e-ba61-3672fcffaacf" (UID: "a536bc34-4f68-4f0e-ba61-3672fcffaacf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.601177 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp6cl\" (UniqueName: \"kubernetes.io/projected/a536bc34-4f68-4f0e-ba61-3672fcffaacf-kube-api-access-vp6cl\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.601310 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.601373 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.601440 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.601503 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-config\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.601588 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.601654 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a536bc34-4f68-4f0e-ba61-3672fcffaacf-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.762702 
4707 generic.go:334] "Generic (PLEG): container finished" podID="a536bc34-4f68-4f0e-ba61-3672fcffaacf" containerID="985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c" exitCode=0 Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.762758 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" event={"ID":"a536bc34-4f68-4f0e-ba61-3672fcffaacf","Type":"ContainerDied","Data":"985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c"} Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.762797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" event={"ID":"a536bc34-4f68-4f0e-ba61-3672fcffaacf","Type":"ContainerDied","Data":"864a0972233b8fd1373e0d91d4767848c29131fb743dfa7c02d2d1151d38519b"} Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.762817 4707 scope.go:117] "RemoveContainer" containerID="985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.762841 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-fsh5f" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.785192 4707 scope.go:117] "RemoveContainer" containerID="50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.806848 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-fsh5f"] Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.817032 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-fsh5f"] Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.822579 4707 scope.go:117] "RemoveContainer" containerID="985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c" Jan 29 03:51:06 crc kubenswrapper[4707]: E0129 03:51:06.823089 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c\": container with ID starting with 985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c not found: ID does not exist" containerID="985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.823138 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c"} err="failed to get container status \"985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c\": rpc error: code = NotFound desc = could not find container \"985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c\": container with ID starting with 985f107ef417565bd7dae69a18ee966093b0f3f7ecc0ea91a47532adf07fc63c not found: ID does not exist" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.823167 4707 scope.go:117] "RemoveContainer" containerID="50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549" Jan 29 
03:51:06 crc kubenswrapper[4707]: E0129 03:51:06.823415 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549\": container with ID starting with 50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549 not found: ID does not exist" containerID="50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549" Jan 29 03:51:06 crc kubenswrapper[4707]: I0129 03:51:06.823444 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549"} err="failed to get container status \"50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549\": rpc error: code = NotFound desc = could not find container \"50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549\": container with ID starting with 50fcf0468a7580238b7a27b9ddbba50ba635befd89e71924db73cddc6c1f9549 not found: ID does not exist" Jan 29 03:51:07 crc kubenswrapper[4707]: I0129 03:51:07.253914 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a536bc34-4f68-4f0e-ba61-3672fcffaacf" path="/var/lib/kubelet/pods/a536bc34-4f68-4f0e-ba61-3672fcffaacf/volumes" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.604779 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s"] Jan 29 03:51:14 crc kubenswrapper[4707]: E0129 03:51:14.613987 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3cea440-5000-4c97-96ab-6436f2a69e02" containerName="dnsmasq-dns" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614031 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cea440-5000-4c97-96ab-6436f2a69e02" containerName="dnsmasq-dns" Jan 29 03:51:14 crc kubenswrapper[4707]: E0129 03:51:14.614100 4707 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c3cea440-5000-4c97-96ab-6436f2a69e02" containerName="init" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614107 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cea440-5000-4c97-96ab-6436f2a69e02" containerName="init" Jan 29 03:51:14 crc kubenswrapper[4707]: E0129 03:51:14.614131 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="registry-server" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614137 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="registry-server" Jan 29 03:51:14 crc kubenswrapper[4707]: E0129 03:51:14.614156 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="extract-content" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614162 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="extract-content" Jan 29 03:51:14 crc kubenswrapper[4707]: E0129 03:51:14.614171 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a536bc34-4f68-4f0e-ba61-3672fcffaacf" containerName="init" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614177 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a536bc34-4f68-4f0e-ba61-3672fcffaacf" containerName="init" Jan 29 03:51:14 crc kubenswrapper[4707]: E0129 03:51:14.614185 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a536bc34-4f68-4f0e-ba61-3672fcffaacf" containerName="dnsmasq-dns" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614193 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a536bc34-4f68-4f0e-ba61-3672fcffaacf" containerName="dnsmasq-dns" Jan 29 03:51:14 crc kubenswrapper[4707]: E0129 03:51:14.614215 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" 
containerName="extract-utilities" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614221 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="extract-utilities" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614611 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a536bc34-4f68-4f0e-ba61-3672fcffaacf" containerName="dnsmasq-dns" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614626 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3cea440-5000-4c97-96ab-6436f2a69e02" containerName="dnsmasq-dns" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.614674 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc41ae1b-9615-4c1c-b352-49196fc35cec" containerName="registry-server" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.615460 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.617447 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s"] Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.617747 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.618129 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.618145 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.618187 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 
03:51:14.787004 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.787066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.787124 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.787410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r62kp\" (UniqueName: \"kubernetes.io/projected/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-kube-api-access-r62kp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.890098 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.890871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.891031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.891235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r62kp\" (UniqueName: \"kubernetes.io/projected/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-kube-api-access-r62kp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.895890 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: 
\"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.898221 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.900293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.908848 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r62kp\" (UniqueName: \"kubernetes.io/projected/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-kube-api-access-r62kp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:14 crc kubenswrapper[4707]: I0129 03:51:14.964641 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:15 crc kubenswrapper[4707]: I0129 03:51:15.492663 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s"] Jan 29 03:51:15 crc kubenswrapper[4707]: W0129 03:51:15.497758 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b1ed1fd_6748_40d9_b458_ca63cb4479e0.slice/crio-ee5cce2ab37574b29111217de66e6c6909422866a5da8161197ddc83ee1d1c50 WatchSource:0}: Error finding container ee5cce2ab37574b29111217de66e6c6909422866a5da8161197ddc83ee1d1c50: Status 404 returned error can't find the container with id ee5cce2ab37574b29111217de66e6c6909422866a5da8161197ddc83ee1d1c50 Jan 29 03:51:15 crc kubenswrapper[4707]: I0129 03:51:15.856053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" event={"ID":"3b1ed1fd-6748-40d9-b458-ca63cb4479e0","Type":"ContainerStarted","Data":"ee5cce2ab37574b29111217de66e6c6909422866a5da8161197ddc83ee1d1c50"} Jan 29 03:51:18 crc kubenswrapper[4707]: I0129 03:51:18.892441 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb587ed3-9015-4748-a28b-10d4132ffdfb" containerID="6eeb1c385c96c6cc3b01e6b77c0e2a06459619465c90cbdc7c73439c5282b464" exitCode=0 Jan 29 03:51:18 crc kubenswrapper[4707]: I0129 03:51:18.892565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fb587ed3-9015-4748-a28b-10d4132ffdfb","Type":"ContainerDied","Data":"6eeb1c385c96c6cc3b01e6b77c0e2a06459619465c90cbdc7c73439c5282b464"} Jan 29 03:51:20 crc kubenswrapper[4707]: I0129 03:51:20.914246 4707 generic.go:334] "Generic (PLEG): container finished" podID="688208b9-5567-4a47-9ec9-76ce03ec8991" containerID="d33e5ae8ee9f0ac2014772a9c8e416731636c292ecbda4123adbff8ae6aa82e0" exitCode=0 Jan 29 03:51:20 crc 
kubenswrapper[4707]: I0129 03:51:20.914340 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"688208b9-5567-4a47-9ec9-76ce03ec8991","Type":"ContainerDied","Data":"d33e5ae8ee9f0ac2014772a9c8e416731636c292ecbda4123adbff8ae6aa82e0"} Jan 29 03:51:23 crc kubenswrapper[4707]: I0129 03:51:23.947287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fb587ed3-9015-4748-a28b-10d4132ffdfb","Type":"ContainerStarted","Data":"a51ba4ae70220444c4cbe5a110513277dedfcf8db6ecea2c1cb083f1d5e6cd9e"} Jan 29 03:51:23 crc kubenswrapper[4707]: I0129 03:51:23.948117 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 29 03:51:23 crc kubenswrapper[4707]: I0129 03:51:23.948804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" event={"ID":"3b1ed1fd-6748-40d9-b458-ca63cb4479e0","Type":"ContainerStarted","Data":"cfb970dc206e238cacf49fb2d2c08aafad34791b26a22743f85bd8757017402b"} Jan 29 03:51:23 crc kubenswrapper[4707]: I0129 03:51:23.950649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"688208b9-5567-4a47-9ec9-76ce03ec8991","Type":"ContainerStarted","Data":"af2152530575bf02f1dcfe8878a4f6252b803ec3881235c1f444e3f006108b2b"} Jan 29 03:51:23 crc kubenswrapper[4707]: I0129 03:51:23.950930 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:51:23 crc kubenswrapper[4707]: I0129 03:51:23.981127 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.981106675 podStartE2EDuration="40.981106675s" podCreationTimestamp="2026-01-29 03:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-29 03:51:23.976913212 +0000 UTC m=+1437.461142117" watchObservedRunningTime="2026-01-29 03:51:23.981106675 +0000 UTC m=+1437.465335580" Jan 29 03:51:24 crc kubenswrapper[4707]: I0129 03:51:24.005435 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.005338057 podStartE2EDuration="39.005338057s" podCreationTimestamp="2026-01-29 03:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 03:51:23.996112986 +0000 UTC m=+1437.480341911" watchObservedRunningTime="2026-01-29 03:51:24.005338057 +0000 UTC m=+1437.489566962" Jan 29 03:51:24 crc kubenswrapper[4707]: I0129 03:51:24.018872 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" podStartSLOduration=2.166285772 podStartE2EDuration="10.018849543s" podCreationTimestamp="2026-01-29 03:51:14 +0000 UTC" firstStartedPulling="2026-01-29 03:51:15.500313814 +0000 UTC m=+1428.984542719" lastFinishedPulling="2026-01-29 03:51:23.352877585 +0000 UTC m=+1436.837106490" observedRunningTime="2026-01-29 03:51:24.012352832 +0000 UTC m=+1437.496581747" watchObservedRunningTime="2026-01-29 03:51:24.018849543 +0000 UTC m=+1437.503078448" Jan 29 03:51:33 crc kubenswrapper[4707]: I0129 03:51:33.881413 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="fb587ed3-9015-4748-a28b-10d4132ffdfb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.227:5671: connect: connection refused" Jan 29 03:51:34 crc kubenswrapper[4707]: I0129 03:51:34.062985 4707 generic.go:334] "Generic (PLEG): container finished" podID="3b1ed1fd-6748-40d9-b458-ca63cb4479e0" containerID="cfb970dc206e238cacf49fb2d2c08aafad34791b26a22743f85bd8757017402b" exitCode=0 Jan 29 03:51:34 crc kubenswrapper[4707]: I0129 
03:51:34.063073 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" event={"ID":"3b1ed1fd-6748-40d9-b458-ca63cb4479e0","Type":"ContainerDied","Data":"cfb970dc206e238cacf49fb2d2c08aafad34791b26a22743f85bd8757017402b"} Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.578633 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.672064 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-ssh-key-openstack-edpm-ipam\") pod \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.672205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-inventory\") pod \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.672442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r62kp\" (UniqueName: \"kubernetes.io/projected/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-kube-api-access-r62kp\") pod \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.672616 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-repo-setup-combined-ca-bundle\") pod \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\" (UID: \"3b1ed1fd-6748-40d9-b458-ca63cb4479e0\") " Jan 29 03:51:35 crc 
kubenswrapper[4707]: I0129 03:51:35.680216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-kube-api-access-r62kp" (OuterVolumeSpecName: "kube-api-access-r62kp") pod "3b1ed1fd-6748-40d9-b458-ca63cb4479e0" (UID: "3b1ed1fd-6748-40d9-b458-ca63cb4479e0"). InnerVolumeSpecName "kube-api-access-r62kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.686331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3b1ed1fd-6748-40d9-b458-ca63cb4479e0" (UID: "3b1ed1fd-6748-40d9-b458-ca63cb4479e0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.711177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-inventory" (OuterVolumeSpecName: "inventory") pod "3b1ed1fd-6748-40d9-b458-ca63cb4479e0" (UID: "3b1ed1fd-6748-40d9-b458-ca63cb4479e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.712355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3b1ed1fd-6748-40d9-b458-ca63cb4479e0" (UID: "3b1ed1fd-6748-40d9-b458-ca63cb4479e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.774800 4707 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.774861 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.774879 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:35 crc kubenswrapper[4707]: I0129 03:51:35.774893 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r62kp\" (UniqueName: \"kubernetes.io/projected/3b1ed1fd-6748-40d9-b458-ca63cb4479e0-kube-api-access-r62kp\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.007901 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.097021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" event={"ID":"3b1ed1fd-6748-40d9-b458-ca63cb4479e0","Type":"ContainerDied","Data":"ee5cce2ab37574b29111217de66e6c6909422866a5da8161197ddc83ee1d1c50"} Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.097065 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee5cce2ab37574b29111217de66e6c6909422866a5da8161197ddc83ee1d1c50" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.099027 4707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.189823 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9"] Jan 29 03:51:36 crc kubenswrapper[4707]: E0129 03:51:36.191028 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1ed1fd-6748-40d9-b458-ca63cb4479e0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.191182 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1ed1fd-6748-40d9-b458-ca63cb4479e0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.191496 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1ed1fd-6748-40d9-b458-ca63cb4479e0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.192328 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.194391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.199185 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.199331 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.199807 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.210844 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9"] Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.286465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5nrs9\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.286509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5nrs9\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.286657 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khlmg\" (UniqueName: \"kubernetes.io/projected/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-kube-api-access-khlmg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5nrs9\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.389065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khlmg\" (UniqueName: \"kubernetes.io/projected/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-kube-api-access-khlmg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5nrs9\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.389867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5nrs9\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.389900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5nrs9\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.394419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-5nrs9\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.394897 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5nrs9\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.409621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khlmg\" (UniqueName: \"kubernetes.io/projected/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-kube-api-access-khlmg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5nrs9\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.474304 4707 scope.go:117] "RemoveContainer" containerID="64e6665ac05c704158abd73a0de9a85e1c5b623376a0fa745e63b5c66d3ad44b" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.514863 4707 scope.go:117] "RemoveContainer" containerID="8a2b86e2615e61784ad39188cb18d1b8e30c792668a4fd33763356ee7f7d4260" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.523439 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:36 crc kubenswrapper[4707]: I0129 03:51:36.576949 4707 scope.go:117] "RemoveContainer" containerID="97a0567baefc6d5145483c8d2933ed942b1feedf6734609729fa79e60e6c56dd" Jan 29 03:51:37 crc kubenswrapper[4707]: I0129 03:51:37.195477 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9"] Jan 29 03:51:37 crc kubenswrapper[4707]: W0129 03:51:37.198251 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b03e702_6a8f_4bcc_8be0_4ec0eaf53900.slice/crio-be8896b21d4ed378938de8fddbccdd14add4ab6565a45d32360d48e7fdead8eb WatchSource:0}: Error finding container be8896b21d4ed378938de8fddbccdd14add4ab6565a45d32360d48e7fdead8eb: Status 404 returned error can't find the container with id be8896b21d4ed378938de8fddbccdd14add4ab6565a45d32360d48e7fdead8eb Jan 29 03:51:38 crc kubenswrapper[4707]: I0129 03:51:38.123119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" event={"ID":"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900","Type":"ContainerStarted","Data":"49f243e1c8b9c5a594ea7d30e45f16cbc1303b34188d89ee83f6b72fe9b8239c"} Jan 29 03:51:38 crc kubenswrapper[4707]: I0129 03:51:38.123487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" event={"ID":"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900","Type":"ContainerStarted","Data":"be8896b21d4ed378938de8fddbccdd14add4ab6565a45d32360d48e7fdead8eb"} Jan 29 03:51:38 crc kubenswrapper[4707]: I0129 03:51:38.160945 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" podStartSLOduration=1.618844593 podStartE2EDuration="2.160928115s" podCreationTimestamp="2026-01-29 03:51:36 +0000 UTC" 
firstStartedPulling="2026-01-29 03:51:37.205259214 +0000 UTC m=+1450.689488119" lastFinishedPulling="2026-01-29 03:51:37.747342746 +0000 UTC m=+1451.231571641" observedRunningTime="2026-01-29 03:51:38.150617343 +0000 UTC m=+1451.634846248" watchObservedRunningTime="2026-01-29 03:51:38.160928115 +0000 UTC m=+1451.645157020" Jan 29 03:51:41 crc kubenswrapper[4707]: I0129 03:51:41.165926 4707 generic.go:334] "Generic (PLEG): container finished" podID="7b03e702-6a8f-4bcc-8be0-4ec0eaf53900" containerID="49f243e1c8b9c5a594ea7d30e45f16cbc1303b34188d89ee83f6b72fe9b8239c" exitCode=0 Jan 29 03:51:41 crc kubenswrapper[4707]: I0129 03:51:41.165985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" event={"ID":"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900","Type":"ContainerDied","Data":"49f243e1c8b9c5a594ea7d30e45f16cbc1303b34188d89ee83f6b72fe9b8239c"} Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.729024 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.792749 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-inventory\") pod \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.793235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khlmg\" (UniqueName: \"kubernetes.io/projected/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-kube-api-access-khlmg\") pod \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.793319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-ssh-key-openstack-edpm-ipam\") pod \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\" (UID: \"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900\") " Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.841952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-kube-api-access-khlmg" (OuterVolumeSpecName: "kube-api-access-khlmg") pod "7b03e702-6a8f-4bcc-8be0-4ec0eaf53900" (UID: "7b03e702-6a8f-4bcc-8be0-4ec0eaf53900"). InnerVolumeSpecName "kube-api-access-khlmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.870707 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7b03e702-6a8f-4bcc-8be0-4ec0eaf53900" (UID: "7b03e702-6a8f-4bcc-8be0-4ec0eaf53900"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.875699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-inventory" (OuterVolumeSpecName: "inventory") pod "7b03e702-6a8f-4bcc-8be0-4ec0eaf53900" (UID: "7b03e702-6a8f-4bcc-8be0-4ec0eaf53900"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.898123 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khlmg\" (UniqueName: \"kubernetes.io/projected/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-kube-api-access-khlmg\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.898171 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:42 crc kubenswrapper[4707]: I0129 03:51:42.898188 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b03e702-6a8f-4bcc-8be0-4ec0eaf53900-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.190134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" 
event={"ID":"7b03e702-6a8f-4bcc-8be0-4ec0eaf53900","Type":"ContainerDied","Data":"be8896b21d4ed378938de8fddbccdd14add4ab6565a45d32360d48e7fdead8eb"} Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.190787 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8896b21d4ed378938de8fddbccdd14add4ab6565a45d32360d48e7fdead8eb" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.190232 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5nrs9" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.286590 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz"] Jan 29 03:51:43 crc kubenswrapper[4707]: E0129 03:51:43.287042 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b03e702-6a8f-4bcc-8be0-4ec0eaf53900" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.287062 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b03e702-6a8f-4bcc-8be0-4ec0eaf53900" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.287259 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b03e702-6a8f-4bcc-8be0-4ec0eaf53900" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.287983 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.289967 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.290337 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.290837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.291271 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.314126 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz"] Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.412255 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.412376 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.414445 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69gb9\" (UniqueName: \"kubernetes.io/projected/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-kube-api-access-69gb9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.414605 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.517301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69gb9\" (UniqueName: \"kubernetes.io/projected/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-kube-api-access-69gb9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.517373 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.517484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.517526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.522960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.525223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.525685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.549421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69gb9\" (UniqueName: \"kubernetes.io/projected/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-kube-api-access-69gb9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.610652 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:51:43 crc kubenswrapper[4707]: I0129 03:51:43.882733 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 29 03:51:44 crc kubenswrapper[4707]: I0129 03:51:44.265660 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz"] Jan 29 03:51:45 crc kubenswrapper[4707]: I0129 03:51:45.278906 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" podStartSLOduration=1.897003724 podStartE2EDuration="2.278873453s" podCreationTimestamp="2026-01-29 03:51:43 +0000 UTC" firstStartedPulling="2026-01-29 03:51:44.249044365 +0000 UTC m=+1457.733273270" lastFinishedPulling="2026-01-29 03:51:44.630914094 +0000 UTC m=+1458.115142999" observedRunningTime="2026-01-29 03:51:45.274470504 +0000 UTC m=+1458.758699419" watchObservedRunningTime="2026-01-29 03:51:45.278873453 +0000 UTC m=+1458.763102348" Jan 29 03:51:45 crc kubenswrapper[4707]: I0129 03:51:45.280895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" 
event={"ID":"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd","Type":"ContainerStarted","Data":"95e5503a500c492e43ca0ee454fc03aef84a41a46824327b8cf318d436b7a200"} Jan 29 03:51:45 crc kubenswrapper[4707]: I0129 03:51:45.281284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" event={"ID":"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd","Type":"ContainerStarted","Data":"322c17631e48a1a379d9dc12827c88d3b9bbdc83f88d2666e4ef27b8b6b160d5"} Jan 29 03:52:03 crc kubenswrapper[4707]: I0129 03:52:03.464018 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:52:03 crc kubenswrapper[4707]: I0129 03:52:03.465103 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:52:33 crc kubenswrapper[4707]: I0129 03:52:33.463790 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:52:33 crc kubenswrapper[4707]: I0129 03:52:33.465006 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:52:36 
crc kubenswrapper[4707]: I0129 03:52:36.877956 4707 scope.go:117] "RemoveContainer" containerID="870e2315d75415233d0c33e4c9d9bb6fccd5be82d0c4ae21b9404170147f8962" Jan 29 03:52:36 crc kubenswrapper[4707]: I0129 03:52:36.905700 4707 scope.go:117] "RemoveContainer" containerID="8df1c720e0202184135b4823a79e815fd92bc89c379158fee4f43c11bfaeea8b" Jan 29 03:52:36 crc kubenswrapper[4707]: I0129 03:52:36.996019 4707 scope.go:117] "RemoveContainer" containerID="7485bbd2f333b34fd80b590f767a1eab7984596b3169b18b67d25cead81e02a7" Jan 29 03:52:37 crc kubenswrapper[4707]: I0129 03:52:37.015793 4707 scope.go:117] "RemoveContainer" containerID="0d69e262c34236f1538a9551f8682600000c85c5359856fdd0a4eb2a49ffc6e1" Jan 29 03:53:03 crc kubenswrapper[4707]: I0129 03:53:03.463587 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 03:53:03 crc kubenswrapper[4707]: I0129 03:53:03.464295 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 03:53:03 crc kubenswrapper[4707]: I0129 03:53:03.464360 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 03:53:03 crc kubenswrapper[4707]: I0129 03:53:03.465358 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 03:53:03 crc kubenswrapper[4707]: I0129 03:53:03.465411 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" gracePeriod=600 Jan 29 03:53:03 crc kubenswrapper[4707]: E0129 03:53:03.600412 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:53:04 crc kubenswrapper[4707]: I0129 03:53:04.098986 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" exitCode=0 Jan 29 03:53:04 crc kubenswrapper[4707]: I0129 03:53:04.099103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204"} Jan 29 03:53:04 crc kubenswrapper[4707]: I0129 03:53:04.099340 4707 scope.go:117] "RemoveContainer" containerID="cf0424c462083ab78d6f7ecb618cb72c194589faa9e4ad5d9079c9263ae5a027" Jan 29 03:53:04 crc kubenswrapper[4707]: I0129 03:53:04.100478 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:53:04 crc kubenswrapper[4707]: E0129 03:53:04.100970 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:53:19 crc kubenswrapper[4707]: I0129 03:53:19.245047 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:53:19 crc kubenswrapper[4707]: E0129 03:53:19.246391 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:53:20 crc kubenswrapper[4707]: I0129 03:53:20.947462 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m5sbt"] Jan 29 03:53:20 crc kubenswrapper[4707]: I0129 03:53:20.950154 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:20 crc kubenswrapper[4707]: I0129 03:53:20.969910 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5sbt"] Jan 29 03:53:20 crc kubenswrapper[4707]: I0129 03:53:20.989510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-utilities\") pod \"redhat-marketplace-m5sbt\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:20 crc kubenswrapper[4707]: I0129 03:53:20.989626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-catalog-content\") pod \"redhat-marketplace-m5sbt\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:20 crc kubenswrapper[4707]: I0129 03:53:20.989663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4txt\" (UniqueName: \"kubernetes.io/projected/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-kube-api-access-d4txt\") pod \"redhat-marketplace-m5sbt\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:21 crc kubenswrapper[4707]: I0129 03:53:21.091773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-utilities\") pod \"redhat-marketplace-m5sbt\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:21 crc kubenswrapper[4707]: I0129 03:53:21.091867 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-catalog-content\") pod \"redhat-marketplace-m5sbt\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:21 crc kubenswrapper[4707]: I0129 03:53:21.091903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4txt\" (UniqueName: \"kubernetes.io/projected/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-kube-api-access-d4txt\") pod \"redhat-marketplace-m5sbt\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:21 crc kubenswrapper[4707]: I0129 03:53:21.092894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-utilities\") pod \"redhat-marketplace-m5sbt\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:21 crc kubenswrapper[4707]: I0129 03:53:21.093115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-catalog-content\") pod \"redhat-marketplace-m5sbt\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:21 crc kubenswrapper[4707]: I0129 03:53:21.115348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4txt\" (UniqueName: \"kubernetes.io/projected/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-kube-api-access-d4txt\") pod \"redhat-marketplace-m5sbt\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:21 crc kubenswrapper[4707]: I0129 03:53:21.281420 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:21 crc kubenswrapper[4707]: I0129 03:53:21.836299 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5sbt"] Jan 29 03:53:22 crc kubenswrapper[4707]: I0129 03:53:22.286987 4707 generic.go:334] "Generic (PLEG): container finished" podID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerID="70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01" exitCode=0 Jan 29 03:53:22 crc kubenswrapper[4707]: I0129 03:53:22.287109 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5sbt" event={"ID":"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa","Type":"ContainerDied","Data":"70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01"} Jan 29 03:53:22 crc kubenswrapper[4707]: I0129 03:53:22.287452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5sbt" event={"ID":"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa","Type":"ContainerStarted","Data":"77539650342ffd1e64936bd3c13355fc2b2dc41ed97b1662afae313dc31aad76"} Jan 29 03:53:24 crc kubenswrapper[4707]: I0129 03:53:24.311069 4707 generic.go:334] "Generic (PLEG): container finished" podID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerID="5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e" exitCode=0 Jan 29 03:53:24 crc kubenswrapper[4707]: I0129 03:53:24.311137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5sbt" event={"ID":"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa","Type":"ContainerDied","Data":"5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e"} Jan 29 03:53:25 crc kubenswrapper[4707]: I0129 03:53:25.339602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5sbt" 
event={"ID":"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa","Type":"ContainerStarted","Data":"108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6"} Jan 29 03:53:25 crc kubenswrapper[4707]: I0129 03:53:25.370030 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m5sbt" podStartSLOduration=2.8473890060000002 podStartE2EDuration="5.370008145s" podCreationTimestamp="2026-01-29 03:53:20 +0000 UTC" firstStartedPulling="2026-01-29 03:53:22.291275526 +0000 UTC m=+1555.775504421" lastFinishedPulling="2026-01-29 03:53:24.813894655 +0000 UTC m=+1558.298123560" observedRunningTime="2026-01-29 03:53:25.358501249 +0000 UTC m=+1558.842730154" watchObservedRunningTime="2026-01-29 03:53:25.370008145 +0000 UTC m=+1558.854237040" Jan 29 03:53:30 crc kubenswrapper[4707]: I0129 03:53:30.244553 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:53:30 crc kubenswrapper[4707]: E0129 03:53:30.245358 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:53:31 crc kubenswrapper[4707]: I0129 03:53:31.282389 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:31 crc kubenswrapper[4707]: I0129 03:53:31.282441 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:31 crc kubenswrapper[4707]: I0129 03:53:31.336001 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:31 crc kubenswrapper[4707]: I0129 03:53:31.440723 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:31 crc kubenswrapper[4707]: I0129 03:53:31.574806 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5sbt"] Jan 29 03:53:33 crc kubenswrapper[4707]: I0129 03:53:33.410320 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m5sbt" podUID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerName="registry-server" containerID="cri-o://108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6" gracePeriod=2 Jan 29 03:53:33 crc kubenswrapper[4707]: I0129 03:53:33.944675 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.096304 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-catalog-content\") pod \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.096392 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-utilities\") pod \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.096471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4txt\" (UniqueName: \"kubernetes.io/projected/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-kube-api-access-d4txt\") pod 
\"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\" (UID: \"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa\") " Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.097219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-utilities" (OuterVolumeSpecName: "utilities") pod "5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" (UID: "5aede09a-325f-4d6c-bf2c-17b90ed2e6fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.103813 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-kube-api-access-d4txt" (OuterVolumeSpecName: "kube-api-access-d4txt") pod "5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" (UID: "5aede09a-325f-4d6c-bf2c-17b90ed2e6fa"). InnerVolumeSpecName "kube-api-access-d4txt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.124188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" (UID: "5aede09a-325f-4d6c-bf2c-17b90ed2e6fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.199791 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.199841 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.199856 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4txt\" (UniqueName: \"kubernetes.io/projected/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa-kube-api-access-d4txt\") on node \"crc\" DevicePath \"\"" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.422903 4707 generic.go:334] "Generic (PLEG): container finished" podID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerID="108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6" exitCode=0 Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.422967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5sbt" event={"ID":"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa","Type":"ContainerDied","Data":"108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6"} Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.422990 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5sbt" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.423016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5sbt" event={"ID":"5aede09a-325f-4d6c-bf2c-17b90ed2e6fa","Type":"ContainerDied","Data":"77539650342ffd1e64936bd3c13355fc2b2dc41ed97b1662afae313dc31aad76"} Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.423041 4707 scope.go:117] "RemoveContainer" containerID="108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.453193 4707 scope.go:117] "RemoveContainer" containerID="5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.468772 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5sbt"] Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.482438 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5sbt"] Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.496510 4707 scope.go:117] "RemoveContainer" containerID="70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.539878 4707 scope.go:117] "RemoveContainer" containerID="108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6" Jan 29 03:53:34 crc kubenswrapper[4707]: E0129 03:53:34.540383 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6\": container with ID starting with 108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6 not found: ID does not exist" containerID="108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.540453 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6"} err="failed to get container status \"108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6\": rpc error: code = NotFound desc = could not find container \"108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6\": container with ID starting with 108faf237a645900202a040d8e70c0404e4b1f99f45171991d784710769bc6d6 not found: ID does not exist" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.540490 4707 scope.go:117] "RemoveContainer" containerID="5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e" Jan 29 03:53:34 crc kubenswrapper[4707]: E0129 03:53:34.541204 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e\": container with ID starting with 5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e not found: ID does not exist" containerID="5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.541253 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e"} err="failed to get container status \"5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e\": rpc error: code = NotFound desc = could not find container \"5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e\": container with ID starting with 5afbb049fdd4b082a4d0a15508e10199d4a495231334dd6f8e4847fa2ede264e not found: ID does not exist" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.541287 4707 scope.go:117] "RemoveContainer" containerID="70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01" Jan 29 03:53:34 crc kubenswrapper[4707]: E0129 
03:53:34.541870 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01\": container with ID starting with 70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01 not found: ID does not exist" containerID="70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01" Jan 29 03:53:34 crc kubenswrapper[4707]: I0129 03:53:34.541906 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01"} err="failed to get container status \"70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01\": rpc error: code = NotFound desc = could not find container \"70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01\": container with ID starting with 70ddfce72f422db2bb0a0370a84cd9cf34b623513e76a9dd3c77f34e6c1bfe01 not found: ID does not exist" Jan 29 03:53:35 crc kubenswrapper[4707]: I0129 03:53:35.273626 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" path="/var/lib/kubelet/pods/5aede09a-325f-4d6c-bf2c-17b90ed2e6fa/volumes" Jan 29 03:53:37 crc kubenswrapper[4707]: I0129 03:53:37.145248 4707 scope.go:117] "RemoveContainer" containerID="575f6d164bfcdce937417b5d5e8e692dae47e61e20ebca2e68051b469d4975c8" Jan 29 03:53:37 crc kubenswrapper[4707]: I0129 03:53:37.173430 4707 scope.go:117] "RemoveContainer" containerID="d49da7e672e09ce87908fdf2e4bdd2652a410c735912013b3f4ad88f08a97ca8" Jan 29 03:53:37 crc kubenswrapper[4707]: I0129 03:53:37.208458 4707 scope.go:117] "RemoveContainer" containerID="679cbfdc14bb3d07c8bfe13fc180dc5b68454c4c640b96b3614ea951896cee7d" Jan 29 03:53:37 crc kubenswrapper[4707]: I0129 03:53:37.262817 4707 scope.go:117] "RemoveContainer" containerID="5da4aca2b346c268062806371ccff5c4ef359dd215e0ac019744dc7bc7333680" Jan 29 03:53:37 crc 
kubenswrapper[4707]: I0129 03:53:37.287813 4707 scope.go:117] "RemoveContainer" containerID="a61a8c84b6b80ff80b6088aedac76a006021d4d1797b4c1f84fa01f85dfc71d4" Jan 29 03:53:42 crc kubenswrapper[4707]: I0129 03:53:42.244613 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:53:42 crc kubenswrapper[4707]: E0129 03:53:42.245746 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:53:55 crc kubenswrapper[4707]: I0129 03:53:55.244256 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:53:55 crc kubenswrapper[4707]: E0129 03:53:55.245598 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:54:09 crc kubenswrapper[4707]: I0129 03:54:09.243749 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:54:09 crc kubenswrapper[4707]: E0129 03:54:09.244513 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:54:22 crc kubenswrapper[4707]: I0129 03:54:22.244604 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:54:22 crc kubenswrapper[4707]: E0129 03:54:22.245615 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.292835 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dr555"] Jan 29 03:54:30 crc kubenswrapper[4707]: E0129 03:54:30.298584 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerName="extract-content" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.298610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerName="extract-content" Jan 29 03:54:30 crc kubenswrapper[4707]: E0129 03:54:30.298642 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerName="registry-server" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.298651 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerName="registry-server" Jan 29 03:54:30 crc kubenswrapper[4707]: E0129 03:54:30.298675 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerName="extract-utilities" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.298684 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerName="extract-utilities" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.298920 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aede09a-325f-4d6c-bf2c-17b90ed2e6fa" containerName="registry-server" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.300805 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.306351 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dr555"] Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.474708 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5v27\" (UniqueName: \"kubernetes.io/projected/960da531-5c37-480f-aecb-c04ff0b779c3-kube-api-access-n5v27\") pod \"certified-operators-dr555\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.474781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-utilities\") pod \"certified-operators-dr555\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.475049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-catalog-content\") pod \"certified-operators-dr555\" (UID: 
\"960da531-5c37-480f-aecb-c04ff0b779c3\") " pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.577490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5v27\" (UniqueName: \"kubernetes.io/projected/960da531-5c37-480f-aecb-c04ff0b779c3-kube-api-access-n5v27\") pod \"certified-operators-dr555\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.577591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-utilities\") pod \"certified-operators-dr555\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.577828 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-catalog-content\") pod \"certified-operators-dr555\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.578075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-utilities\") pod \"certified-operators-dr555\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.578443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-catalog-content\") pod \"certified-operators-dr555\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") 
" pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.605600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5v27\" (UniqueName: \"kubernetes.io/projected/960da531-5c37-480f-aecb-c04ff0b779c3-kube-api-access-n5v27\") pod \"certified-operators-dr555\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:30 crc kubenswrapper[4707]: I0129 03:54:30.638094 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:31 crc kubenswrapper[4707]: I0129 03:54:31.150407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dr555"] Jan 29 03:54:32 crc kubenswrapper[4707]: I0129 03:54:32.034929 4707 generic.go:334] "Generic (PLEG): container finished" podID="960da531-5c37-480f-aecb-c04ff0b779c3" containerID="1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648" exitCode=0 Jan 29 03:54:32 crc kubenswrapper[4707]: I0129 03:54:32.035107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr555" event={"ID":"960da531-5c37-480f-aecb-c04ff0b779c3","Type":"ContainerDied","Data":"1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648"} Jan 29 03:54:32 crc kubenswrapper[4707]: I0129 03:54:32.035302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr555" event={"ID":"960da531-5c37-480f-aecb-c04ff0b779c3","Type":"ContainerStarted","Data":"9bb8b630d37341ae39942acee7bc21b99c9e2f2370e7dac8cfc891ce4b5410e2"} Jan 29 03:54:32 crc kubenswrapper[4707]: I0129 03:54:32.036986 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 03:54:33 crc kubenswrapper[4707]: I0129 03:54:33.047462 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-dr555" event={"ID":"960da531-5c37-480f-aecb-c04ff0b779c3","Type":"ContainerStarted","Data":"cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3"} Jan 29 03:54:34 crc kubenswrapper[4707]: I0129 03:54:34.060058 4707 generic.go:334] "Generic (PLEG): container finished" podID="960da531-5c37-480f-aecb-c04ff0b779c3" containerID="cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3" exitCode=0 Jan 29 03:54:34 crc kubenswrapper[4707]: I0129 03:54:34.060111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr555" event={"ID":"960da531-5c37-480f-aecb-c04ff0b779c3","Type":"ContainerDied","Data":"cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3"} Jan 29 03:54:34 crc kubenswrapper[4707]: I0129 03:54:34.060864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr555" event={"ID":"960da531-5c37-480f-aecb-c04ff0b779c3","Type":"ContainerStarted","Data":"b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b"} Jan 29 03:54:34 crc kubenswrapper[4707]: I0129 03:54:34.094250 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dr555" podStartSLOduration=2.664948258 podStartE2EDuration="4.094227773s" podCreationTimestamp="2026-01-29 03:54:30 +0000 UTC" firstStartedPulling="2026-01-29 03:54:32.0367146 +0000 UTC m=+1625.520943505" lastFinishedPulling="2026-01-29 03:54:33.465994095 +0000 UTC m=+1626.950223020" observedRunningTime="2026-01-29 03:54:34.078186688 +0000 UTC m=+1627.562415673" watchObservedRunningTime="2026-01-29 03:54:34.094227773 +0000 UTC m=+1627.578456698" Jan 29 03:54:37 crc kubenswrapper[4707]: I0129 03:54:37.243478 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:54:37 crc kubenswrapper[4707]: E0129 03:54:37.244212 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:54:40 crc kubenswrapper[4707]: I0129 03:54:40.638613 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:40 crc kubenswrapper[4707]: I0129 03:54:40.639245 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:40 crc kubenswrapper[4707]: I0129 03:54:40.691842 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:41 crc kubenswrapper[4707]: I0129 03:54:41.168382 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:41 crc kubenswrapper[4707]: I0129 03:54:41.227717 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dr555"] Jan 29 03:54:43 crc kubenswrapper[4707]: I0129 03:54:43.142709 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dr555" podUID="960da531-5c37-480f-aecb-c04ff0b779c3" containerName="registry-server" containerID="cri-o://b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b" gracePeriod=2 Jan 29 03:54:43 crc kubenswrapper[4707]: I0129 03:54:43.658998 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:43 crc kubenswrapper[4707]: I0129 03:54:43.761323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-catalog-content\") pod \"960da531-5c37-480f-aecb-c04ff0b779c3\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " Jan 29 03:54:43 crc kubenswrapper[4707]: I0129 03:54:43.761506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-utilities\") pod \"960da531-5c37-480f-aecb-c04ff0b779c3\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " Jan 29 03:54:43 crc kubenswrapper[4707]: I0129 03:54:43.761636 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5v27\" (UniqueName: \"kubernetes.io/projected/960da531-5c37-480f-aecb-c04ff0b779c3-kube-api-access-n5v27\") pod \"960da531-5c37-480f-aecb-c04ff0b779c3\" (UID: \"960da531-5c37-480f-aecb-c04ff0b779c3\") " Jan 29 03:54:43 crc kubenswrapper[4707]: I0129 03:54:43.763582 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-utilities" (OuterVolumeSpecName: "utilities") pod "960da531-5c37-480f-aecb-c04ff0b779c3" (UID: "960da531-5c37-480f-aecb-c04ff0b779c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:54:43 crc kubenswrapper[4707]: I0129 03:54:43.770514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/960da531-5c37-480f-aecb-c04ff0b779c3-kube-api-access-n5v27" (OuterVolumeSpecName: "kube-api-access-n5v27") pod "960da531-5c37-480f-aecb-c04ff0b779c3" (UID: "960da531-5c37-480f-aecb-c04ff0b779c3"). InnerVolumeSpecName "kube-api-access-n5v27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:54:43 crc kubenswrapper[4707]: I0129 03:54:43.865282 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 03:54:43 crc kubenswrapper[4707]: I0129 03:54:43.865339 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5v27\" (UniqueName: \"kubernetes.io/projected/960da531-5c37-480f-aecb-c04ff0b779c3-kube-api-access-n5v27\") on node \"crc\" DevicePath \"\"" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.154127 4707 generic.go:334] "Generic (PLEG): container finished" podID="960da531-5c37-480f-aecb-c04ff0b779c3" containerID="b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b" exitCode=0 Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.154198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr555" event={"ID":"960da531-5c37-480f-aecb-c04ff0b779c3","Type":"ContainerDied","Data":"b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b"} Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.154220 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dr555" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.154245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr555" event={"ID":"960da531-5c37-480f-aecb-c04ff0b779c3","Type":"ContainerDied","Data":"9bb8b630d37341ae39942acee7bc21b99c9e2f2370e7dac8cfc891ce4b5410e2"} Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.154267 4707 scope.go:117] "RemoveContainer" containerID="b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.175430 4707 scope.go:117] "RemoveContainer" containerID="cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.205051 4707 scope.go:117] "RemoveContainer" containerID="1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.237494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "960da531-5c37-480f-aecb-c04ff0b779c3" (UID: "960da531-5c37-480f-aecb-c04ff0b779c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.245317 4707 scope.go:117] "RemoveContainer" containerID="b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b" Jan 29 03:54:44 crc kubenswrapper[4707]: E0129 03:54:44.245768 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b\": container with ID starting with b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b not found: ID does not exist" containerID="b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.245799 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b"} err="failed to get container status \"b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b\": rpc error: code = NotFound desc = could not find container \"b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b\": container with ID starting with b7a8b973f47585105ccb2cbb3c01939d495db819b03443c9bd44e9d5f324008b not found: ID does not exist" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.245824 4707 scope.go:117] "RemoveContainer" containerID="cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3" Jan 29 03:54:44 crc kubenswrapper[4707]: E0129 03:54:44.246295 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3\": container with ID starting with cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3 not found: ID does not exist" containerID="cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.246318 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3"} err="failed to get container status \"cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3\": rpc error: code = NotFound desc = could not find container \"cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3\": container with ID starting with cbf05212bd4cb81c16682ea69d66d304d1f58b3f1625afa263503aaf29b6a8a3 not found: ID does not exist" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.246332 4707 scope.go:117] "RemoveContainer" containerID="1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648" Jan 29 03:54:44 crc kubenswrapper[4707]: E0129 03:54:44.247372 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648\": container with ID starting with 1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648 not found: ID does not exist" containerID="1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.247451 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648"} err="failed to get container status \"1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648\": rpc error: code = NotFound desc = could not find container \"1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648\": container with ID starting with 1f309da81164d7e799c6c16f7fa88278c90c8f8c93b95a1cc9134cd3a8c0c648 not found: ID does not exist" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.272352 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/960da531-5c37-480f-aecb-c04ff0b779c3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.488139 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dr555"] Jan 29 03:54:44 crc kubenswrapper[4707]: I0129 03:54:44.497986 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dr555"] Jan 29 03:54:45 crc kubenswrapper[4707]: I0129 03:54:45.260033 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="960da531-5c37-480f-aecb-c04ff0b779c3" path="/var/lib/kubelet/pods/960da531-5c37-480f-aecb-c04ff0b779c3/volumes" Jan 29 03:54:50 crc kubenswrapper[4707]: I0129 03:54:50.244174 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:54:50 crc kubenswrapper[4707]: E0129 03:54:50.245441 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:54:52 crc kubenswrapper[4707]: I0129 03:54:52.242085 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ed3ca47-cf57-4534-b12c-2aa6c2be26cd" containerID="95e5503a500c492e43ca0ee454fc03aef84a41a46824327b8cf318d436b7a200" exitCode=0 Jan 29 03:54:52 crc kubenswrapper[4707]: I0129 03:54:52.242178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" event={"ID":"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd","Type":"ContainerDied","Data":"95e5503a500c492e43ca0ee454fc03aef84a41a46824327b8cf318d436b7a200"} Jan 29 03:54:53 crc 
kubenswrapper[4707]: I0129 03:54:53.738026 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:54:53 crc kubenswrapper[4707]: I0129 03:54:53.890967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-bootstrap-combined-ca-bundle\") pod \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " Jan 29 03:54:53 crc kubenswrapper[4707]: I0129 03:54:53.891036 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-ssh-key-openstack-edpm-ipam\") pod \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " Jan 29 03:54:53 crc kubenswrapper[4707]: I0129 03:54:53.891261 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-inventory\") pod \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " Jan 29 03:54:53 crc kubenswrapper[4707]: I0129 03:54:53.891311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69gb9\" (UniqueName: \"kubernetes.io/projected/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-kube-api-access-69gb9\") pod \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\" (UID: \"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd\") " Jan 29 03:54:53 crc kubenswrapper[4707]: I0129 03:54:53.897366 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-kube-api-access-69gb9" (OuterVolumeSpecName: "kube-api-access-69gb9") pod "4ed3ca47-cf57-4534-b12c-2aa6c2be26cd" (UID: 
"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd"). InnerVolumeSpecName "kube-api-access-69gb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:54:53 crc kubenswrapper[4707]: I0129 03:54:53.912862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4ed3ca47-cf57-4534-b12c-2aa6c2be26cd" (UID: "4ed3ca47-cf57-4534-b12c-2aa6c2be26cd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:54:53 crc kubenswrapper[4707]: I0129 03:54:53.925666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4ed3ca47-cf57-4534-b12c-2aa6c2be26cd" (UID: "4ed3ca47-cf57-4534-b12c-2aa6c2be26cd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:54:53 crc kubenswrapper[4707]: I0129 03:54:53.931616 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-inventory" (OuterVolumeSpecName: "inventory") pod "4ed3ca47-cf57-4534-b12c-2aa6c2be26cd" (UID: "4ed3ca47-cf57-4534-b12c-2aa6c2be26cd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.000190 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.000407 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69gb9\" (UniqueName: \"kubernetes.io/projected/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-kube-api-access-69gb9\") on node \"crc\" DevicePath \"\"" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.000671 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.000700 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ed3ca47-cf57-4534-b12c-2aa6c2be26cd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.261409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" event={"ID":"4ed3ca47-cf57-4534-b12c-2aa6c2be26cd","Type":"ContainerDied","Data":"322c17631e48a1a379d9dc12827c88d3b9bbdc83f88d2666e4ef27b8b6b160d5"} Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.261736 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="322c17631e48a1a379d9dc12827c88d3b9bbdc83f88d2666e4ef27b8b6b160d5" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.261488 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.350234 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt"] Jan 29 03:54:54 crc kubenswrapper[4707]: E0129 03:54:54.350711 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="960da531-5c37-480f-aecb-c04ff0b779c3" containerName="extract-content" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.350728 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="960da531-5c37-480f-aecb-c04ff0b779c3" containerName="extract-content" Jan 29 03:54:54 crc kubenswrapper[4707]: E0129 03:54:54.350750 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="960da531-5c37-480f-aecb-c04ff0b779c3" containerName="extract-utilities" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.350757 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="960da531-5c37-480f-aecb-c04ff0b779c3" containerName="extract-utilities" Jan 29 03:54:54 crc kubenswrapper[4707]: E0129 03:54:54.350773 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed3ca47-cf57-4534-b12c-2aa6c2be26cd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.350780 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed3ca47-cf57-4534-b12c-2aa6c2be26cd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 29 03:54:54 crc kubenswrapper[4707]: E0129 03:54:54.350791 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="960da531-5c37-480f-aecb-c04ff0b779c3" containerName="registry-server" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.350797 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="960da531-5c37-480f-aecb-c04ff0b779c3" containerName="registry-server" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.350981 
4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed3ca47-cf57-4534-b12c-2aa6c2be26cd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.351001 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="960da531-5c37-480f-aecb-c04ff0b779c3" containerName="registry-server" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.351678 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.354663 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.354948 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.354978 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.362211 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt"] Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.362605 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.510843 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 
crc kubenswrapper[4707]: I0129 03:54:54.510971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzlzp\" (UniqueName: \"kubernetes.io/projected/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-kube-api-access-tzlzp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.511012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.612926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.613064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzlzp\" (UniqueName: \"kubernetes.io/projected/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-kube-api-access-tzlzp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.613118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.618058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.622027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.629268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzlzp\" (UniqueName: \"kubernetes.io/projected/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-kube-api-access-tzlzp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:54 crc kubenswrapper[4707]: I0129 03:54:54.684023 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:54:55 crc kubenswrapper[4707]: I0129 03:54:55.223496 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt"] Jan 29 03:54:55 crc kubenswrapper[4707]: I0129 03:54:55.274917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" event={"ID":"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7","Type":"ContainerStarted","Data":"0f1c10ac226557465a35c3321a745eb7e07c4ae23b302f866f3206781c615f87"} Jan 29 03:54:56 crc kubenswrapper[4707]: I0129 03:54:56.284036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" event={"ID":"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7","Type":"ContainerStarted","Data":"c156f0a8a750d68ccd60a3bebd18b620683626bfe39dd876a5fc28b7ad62a68d"} Jan 29 03:54:56 crc kubenswrapper[4707]: I0129 03:54:56.299860 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" podStartSLOduration=1.8873978550000001 podStartE2EDuration="2.299839326s" podCreationTimestamp="2026-01-29 03:54:54 +0000 UTC" firstStartedPulling="2026-01-29 03:54:55.229110293 +0000 UTC m=+1648.713339228" lastFinishedPulling="2026-01-29 03:54:55.641551794 +0000 UTC m=+1649.125780699" observedRunningTime="2026-01-29 03:54:56.298498878 +0000 UTC m=+1649.782727793" watchObservedRunningTime="2026-01-29 03:54:56.299839326 +0000 UTC m=+1649.784068221" Jan 29 03:55:05 crc kubenswrapper[4707]: I0129 03:55:05.243702 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:55:05 crc kubenswrapper[4707]: E0129 03:55:05.244481 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:55:12 crc kubenswrapper[4707]: I0129 03:55:12.071255 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-66c3-account-create-update-gzzxt"] Jan 29 03:55:12 crc kubenswrapper[4707]: I0129 03:55:12.087100 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-brvzb"] Jan 29 03:55:12 crc kubenswrapper[4707]: I0129 03:55:12.097409 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0e72-account-create-update-shtrl"] Jan 29 03:55:12 crc kubenswrapper[4707]: I0129 03:55:12.111275 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hxjqt"] Jan 29 03:55:12 crc kubenswrapper[4707]: I0129 03:55:12.127587 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0e72-account-create-update-shtrl"] Jan 29 03:55:12 crc kubenswrapper[4707]: I0129 03:55:12.141324 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-brvzb"] Jan 29 03:55:12 crc kubenswrapper[4707]: I0129 03:55:12.153409 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-66c3-account-create-update-gzzxt"] Jan 29 03:55:12 crc kubenswrapper[4707]: I0129 03:55:12.167315 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hxjqt"] Jan 29 03:55:13 crc kubenswrapper[4707]: I0129 03:55:13.254184 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cea5dce-4548-46ba-b461-8c98a63a7daf" path="/var/lib/kubelet/pods/3cea5dce-4548-46ba-b461-8c98a63a7daf/volumes" Jan 29 03:55:13 crc kubenswrapper[4707]: I0129 03:55:13.254792 4707 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba70b464-3262-4a27-b710-6ec145fc1a8f" path="/var/lib/kubelet/pods/ba70b464-3262-4a27-b710-6ec145fc1a8f/volumes" Jan 29 03:55:13 crc kubenswrapper[4707]: I0129 03:55:13.255390 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0826415-3b19-4838-b645-1d5e36ba6e16" path="/var/lib/kubelet/pods/c0826415-3b19-4838-b645-1d5e36ba6e16/volumes" Jan 29 03:55:13 crc kubenswrapper[4707]: I0129 03:55:13.255950 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb2a1269-a03b-4a5f-b61e-ad1f94576f9e" path="/var/lib/kubelet/pods/cb2a1269-a03b-4a5f-b61e-ad1f94576f9e/volumes" Jan 29 03:55:17 crc kubenswrapper[4707]: I0129 03:55:17.038864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f123-account-create-update-9n45l"] Jan 29 03:55:17 crc kubenswrapper[4707]: I0129 03:55:17.051454 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2tdqh"] Jan 29 03:55:17 crc kubenswrapper[4707]: I0129 03:55:17.064730 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2tdqh"] Jan 29 03:55:17 crc kubenswrapper[4707]: I0129 03:55:17.073361 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f123-account-create-update-9n45l"] Jan 29 03:55:17 crc kubenswrapper[4707]: I0129 03:55:17.264081 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88180d5b-94a0-4a62-9d77-5679c91e5d07" path="/var/lib/kubelet/pods/88180d5b-94a0-4a62-9d77-5679c91e5d07/volumes" Jan 29 03:55:17 crc kubenswrapper[4707]: I0129 03:55:17.265457 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdac9d85-fc1b-4556-8d11-6d43a6b24753" path="/var/lib/kubelet/pods/cdac9d85-fc1b-4556-8d11-6d43a6b24753/volumes" Jan 29 03:55:20 crc kubenswrapper[4707]: I0129 03:55:20.243497 4707 scope.go:117] "RemoveContainer" 
containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:55:20 crc kubenswrapper[4707]: E0129 03:55:20.244756 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:55:31 crc kubenswrapper[4707]: I0129 03:55:31.244211 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:55:31 crc kubenswrapper[4707]: E0129 03:55:31.246919 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:55:37 crc kubenswrapper[4707]: I0129 03:55:37.430160 4707 scope.go:117] "RemoveContainer" containerID="46cc6c097bb1ec5c99a016d57ff85959148ea67be60e7fb165d9b2d1a56d1bef" Jan 29 03:55:37 crc kubenswrapper[4707]: I0129 03:55:37.505530 4707 scope.go:117] "RemoveContainer" containerID="1006064bb35e480fe9e60e724004e172a59ea17a7fdbbab215838f92c2a15d3a" Jan 29 03:55:37 crc kubenswrapper[4707]: I0129 03:55:37.529062 4707 scope.go:117] "RemoveContainer" containerID="6aa2773ce2855195c8d89e118373e54b6473d11c286b519b57fa479193263ce4" Jan 29 03:55:37 crc kubenswrapper[4707]: I0129 03:55:37.576967 4707 scope.go:117] "RemoveContainer" containerID="606e86b23eecaebc92447482e60b7bb8b696cbfcbe09dcba491d207262bf4b4a" Jan 29 03:55:37 crc 
kubenswrapper[4707]: I0129 03:55:37.629024 4707 scope.go:117] "RemoveContainer" containerID="3bbd1c6ecf561f8ae2565973ac9dc34d83ec9e52f04b968f70bb36b68ea9b7e8" Jan 29 03:55:37 crc kubenswrapper[4707]: I0129 03:55:37.680453 4707 scope.go:117] "RemoveContainer" containerID="f5e79da3b8d462ac93c347688474fe44bd456d32086b8d4a5830a90ab4325747" Jan 29 03:55:37 crc kubenswrapper[4707]: I0129 03:55:37.697265 4707 scope.go:117] "RemoveContainer" containerID="4eb5b45b2d80302bb040d1a4c3b6896389f5ad7a1867de1d8a3b1723b93a8f69" Jan 29 03:55:37 crc kubenswrapper[4707]: I0129 03:55:37.751018 4707 scope.go:117] "RemoveContainer" containerID="1e027fad81ca7dfa8a4f749e38b8f6b282f522cfd7b1bb5d8e70b57e221edd89" Jan 29 03:55:37 crc kubenswrapper[4707]: I0129 03:55:37.796352 4707 scope.go:117] "RemoveContainer" containerID="ea1fb86af1f4e7c44a74b553e65ed8578eae636a139f6e103fe26a55aa675452" Jan 29 03:55:37 crc kubenswrapper[4707]: I0129 03:55:37.817348 4707 scope.go:117] "RemoveContainer" containerID="ec94449a991d649c47094022ec2cbc29578fe96f14d79940db68fa092ec2d267" Jan 29 03:55:41 crc kubenswrapper[4707]: I0129 03:55:41.084350 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-swv4z"] Jan 29 03:55:41 crc kubenswrapper[4707]: I0129 03:55:41.095164 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-swv4z"] Jan 29 03:55:41 crc kubenswrapper[4707]: I0129 03:55:41.253249 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0046b5a1-ddfb-44d6-9a24-301c0cf61b75" path="/var/lib/kubelet/pods/0046b5a1-ddfb-44d6-9a24-301c0cf61b75/volumes" Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.029636 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-62c6-account-create-update-z9fv5"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.040240 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-trkf9"] Jan 29 03:55:42 crc kubenswrapper[4707]: 
I0129 03:55:42.049262 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d1f5-account-create-update-7p7sh"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.059004 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-66fxr"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.091429 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-6wzp8"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.101236 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-66fxr"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.114460 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d1f5-account-create-update-7p7sh"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.124388 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-62c6-account-create-update-z9fv5"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.133054 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-6wzp8"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.141467 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-trkf9"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.149636 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e746-account-create-update-s9k4v"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.158022 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e746-account-create-update-s9k4v"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.166898 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c73b-account-create-update-fvw6z"] Jan 29 03:55:42 crc kubenswrapper[4707]: I0129 03:55:42.176474 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c73b-account-create-update-fvw6z"] Jan 29 03:55:42 crc 
kubenswrapper[4707]: I0129 03:55:42.243583 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:55:42 crc kubenswrapper[4707]: E0129 03:55:42.244057 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:55:43 crc kubenswrapper[4707]: I0129 03:55:43.253818 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e10248-9205-4bb2-be05-60aa2647f447" path="/var/lib/kubelet/pods/01e10248-9205-4bb2-be05-60aa2647f447/volumes" Jan 29 03:55:43 crc kubenswrapper[4707]: I0129 03:55:43.254378 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db7e1d7-27c8-4f26-9288-fb94302ba13b" path="/var/lib/kubelet/pods/2db7e1d7-27c8-4f26-9288-fb94302ba13b/volumes" Jan 29 03:55:43 crc kubenswrapper[4707]: I0129 03:55:43.255008 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a74e51-b46b-4d96-ba78-5073504fb9c5" path="/var/lib/kubelet/pods/32a74e51-b46b-4d96-ba78-5073504fb9c5/volumes" Jan 29 03:55:43 crc kubenswrapper[4707]: I0129 03:55:43.255519 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0c5c1a-19a4-475f-a810-71feb6ff1d5f" path="/var/lib/kubelet/pods/9d0c5c1a-19a4-475f-a810-71feb6ff1d5f/volumes" Jan 29 03:55:43 crc kubenswrapper[4707]: I0129 03:55:43.256834 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5" path="/var/lib/kubelet/pods/cf4b89d8-90c5-4ac0-a4e0-4c4d8bffbcc5/volumes" Jan 29 03:55:43 crc kubenswrapper[4707]: I0129 03:55:43.257445 4707 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f0521623-7e83-4fec-b9b1-3414ae979a0d" path="/var/lib/kubelet/pods/f0521623-7e83-4fec-b9b1-3414ae979a0d/volumes" Jan 29 03:55:43 crc kubenswrapper[4707]: I0129 03:55:43.258036 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4324488-aeee-4bcb-b62b-8b238db04a68" path="/var/lib/kubelet/pods/f4324488-aeee-4bcb-b62b-8b238db04a68/volumes" Jan 29 03:55:47 crc kubenswrapper[4707]: I0129 03:55:47.034271 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-m74xr"] Jan 29 03:55:47 crc kubenswrapper[4707]: I0129 03:55:47.047402 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-m74xr"] Jan 29 03:55:47 crc kubenswrapper[4707]: I0129 03:55:47.253690 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65e0ca5-1e58-4492-bd8d-92ff6d516014" path="/var/lib/kubelet/pods/d65e0ca5-1e58-4492-bd8d-92ff6d516014/volumes" Jan 29 03:55:54 crc kubenswrapper[4707]: I0129 03:55:54.032564 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wwvj2"] Jan 29 03:55:54 crc kubenswrapper[4707]: I0129 03:55:54.041068 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wwvj2"] Jan 29 03:55:54 crc kubenswrapper[4707]: I0129 03:55:54.243740 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:55:54 crc kubenswrapper[4707]: E0129 03:55:54.244020 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:55:55 crc kubenswrapper[4707]: 
I0129 03:55:55.253975 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f945e029-2a96-43ab-93aa-556eeadfda35" path="/var/lib/kubelet/pods/f945e029-2a96-43ab-93aa-556eeadfda35/volumes" Jan 29 03:56:08 crc kubenswrapper[4707]: I0129 03:56:08.243357 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:56:08 crc kubenswrapper[4707]: E0129 03:56:08.244164 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:56:18 crc kubenswrapper[4707]: I0129 03:56:18.075368 4707 generic.go:334] "Generic (PLEG): container finished" podID="43b1dffd-18d4-4201-9a3f-5ef4db33c8b7" containerID="c156f0a8a750d68ccd60a3bebd18b620683626bfe39dd876a5fc28b7ad62a68d" exitCode=0 Jan 29 03:56:18 crc kubenswrapper[4707]: I0129 03:56:18.075456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" event={"ID":"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7","Type":"ContainerDied","Data":"c156f0a8a750d68ccd60a3bebd18b620683626bfe39dd876a5fc28b7ad62a68d"} Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.584391 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.779932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzlzp\" (UniqueName: \"kubernetes.io/projected/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-kube-api-access-tzlzp\") pod \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.780044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-inventory\") pod \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.780386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-ssh-key-openstack-edpm-ipam\") pod \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\" (UID: \"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7\") " Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.789062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-kube-api-access-tzlzp" (OuterVolumeSpecName: "kube-api-access-tzlzp") pod "43b1dffd-18d4-4201-9a3f-5ef4db33c8b7" (UID: "43b1dffd-18d4-4201-9a3f-5ef4db33c8b7"). InnerVolumeSpecName "kube-api-access-tzlzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.808322 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-inventory" (OuterVolumeSpecName: "inventory") pod "43b1dffd-18d4-4201-9a3f-5ef4db33c8b7" (UID: "43b1dffd-18d4-4201-9a3f-5ef4db33c8b7"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.808847 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "43b1dffd-18d4-4201-9a3f-5ef4db33c8b7" (UID: "43b1dffd-18d4-4201-9a3f-5ef4db33c8b7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.884639 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.884706 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzlzp\" (UniqueName: \"kubernetes.io/projected/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-kube-api-access-tzlzp\") on node \"crc\" DevicePath \"\"" Jan 29 03:56:19 crc kubenswrapper[4707]: I0129 03:56:19.884720 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b1dffd-18d4-4201-9a3f-5ef4db33c8b7-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.097065 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" event={"ID":"43b1dffd-18d4-4201-9a3f-5ef4db33c8b7","Type":"ContainerDied","Data":"0f1c10ac226557465a35c3321a745eb7e07c4ae23b302f866f3206781c615f87"} Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.097117 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f1c10ac226557465a35c3321a745eb7e07c4ae23b302f866f3206781c615f87" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 
03:56:20.097090 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.180226 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7"] Jan 29 03:56:20 crc kubenswrapper[4707]: E0129 03:56:20.180728 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b1dffd-18d4-4201-9a3f-5ef4db33c8b7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.180749 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b1dffd-18d4-4201-9a3f-5ef4db33c8b7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.180945 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b1dffd-18d4-4201-9a3f-5ef4db33c8b7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.181639 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.183477 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.183592 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.183918 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.184675 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.189256 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7"] Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.191370 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.191454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc 
kubenswrapper[4707]: I0129 03:56:20.191490 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw5j5\" (UniqueName: \"kubernetes.io/projected/98ca67a4-de19-4954-a988-1c743df160cd-kube-api-access-hw5j5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.245132 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:56:20 crc kubenswrapper[4707]: E0129 03:56:20.245410 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.293375 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.293498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.293554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw5j5\" (UniqueName: \"kubernetes.io/projected/98ca67a4-de19-4954-a988-1c743df160cd-kube-api-access-hw5j5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.305267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.306793 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.311010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw5j5\" (UniqueName: \"kubernetes.io/projected/98ca67a4-de19-4954-a988-1c743df160cd-kube-api-access-hw5j5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:20 crc kubenswrapper[4707]: I0129 03:56:20.496236 4707 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:56:21 crc kubenswrapper[4707]: I0129 03:56:21.079586 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7"] Jan 29 03:56:21 crc kubenswrapper[4707]: I0129 03:56:21.106814 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" event={"ID":"98ca67a4-de19-4954-a988-1c743df160cd","Type":"ContainerStarted","Data":"acc059bcb0fd6d6f82d11ce5c459c9db349ba18bcc01263f2261b1dfab1be7c3"} Jan 29 03:56:22 crc kubenswrapper[4707]: I0129 03:56:22.152722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" event={"ID":"98ca67a4-de19-4954-a988-1c743df160cd","Type":"ContainerStarted","Data":"9bf7bcbd0de682f42f2471b214553954e83609a427e8cfb078afa9b88c3d7a79"} Jan 29 03:56:22 crc kubenswrapper[4707]: I0129 03:56:22.192365 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" podStartSLOduration=1.754795824 podStartE2EDuration="2.192340548s" podCreationTimestamp="2026-01-29 03:56:20 +0000 UTC" firstStartedPulling="2026-01-29 03:56:21.080201599 +0000 UTC m=+1734.564430504" lastFinishedPulling="2026-01-29 03:56:21.517746323 +0000 UTC m=+1735.001975228" observedRunningTime="2026-01-29 03:56:22.181630744 +0000 UTC m=+1735.665859679" watchObservedRunningTime="2026-01-29 03:56:22.192340548 +0000 UTC m=+1735.676569463" Jan 29 03:56:26 crc kubenswrapper[4707]: I0129 03:56:26.054007 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-2hmmz"] Jan 29 03:56:26 crc kubenswrapper[4707]: I0129 03:56:26.064198 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-2hmmz"] Jan 29 03:56:27 crc 
kubenswrapper[4707]: I0129 03:56:27.276952 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42399f3f-f8c5-45dc-b192-b3a8997c4636" path="/var/lib/kubelet/pods/42399f3f-f8c5-45dc-b192-b3a8997c4636/volumes" Jan 29 03:56:29 crc kubenswrapper[4707]: I0129 03:56:29.027770 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5k88w"] Jan 29 03:56:29 crc kubenswrapper[4707]: I0129 03:56:29.037142 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5k88w"] Jan 29 03:56:29 crc kubenswrapper[4707]: I0129 03:56:29.261426 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae136de8-5472-4668-9856-3b7b45942c99" path="/var/lib/kubelet/pods/ae136de8-5472-4668-9856-3b7b45942c99/volumes" Jan 29 03:56:31 crc kubenswrapper[4707]: I0129 03:56:31.243897 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:56:31 crc kubenswrapper[4707]: E0129 03:56:31.244513 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:56:36 crc kubenswrapper[4707]: I0129 03:56:36.033092 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8tm9r"] Jan 29 03:56:36 crc kubenswrapper[4707]: I0129 03:56:36.043152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gjzwn"] Jan 29 03:56:36 crc kubenswrapper[4707]: I0129 03:56:36.054222 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8tm9r"] Jan 29 03:56:36 crc 
kubenswrapper[4707]: I0129 03:56:36.064587 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-9gbdt"] Jan 29 03:56:36 crc kubenswrapper[4707]: I0129 03:56:36.073476 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gjzwn"] Jan 29 03:56:36 crc kubenswrapper[4707]: I0129 03:56:36.082757 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-9gbdt"] Jan 29 03:56:37 crc kubenswrapper[4707]: I0129 03:56:37.254193 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a2cf721-9a8b-49ab-9e57-1337f407db4f" path="/var/lib/kubelet/pods/3a2cf721-9a8b-49ab-9e57-1337f407db4f/volumes" Jan 29 03:56:37 crc kubenswrapper[4707]: I0129 03:56:37.255269 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3fb185b-2bb2-4cc2-8572-b38db5027edb" path="/var/lib/kubelet/pods/c3fb185b-2bb2-4cc2-8572-b38db5027edb/volumes" Jan 29 03:56:37 crc kubenswrapper[4707]: I0129 03:56:37.255829 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f326dfe8-582f-44e7-9030-8bbfbf4ccb68" path="/var/lib/kubelet/pods/f326dfe8-582f-44e7-9030-8bbfbf4ccb68/volumes" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.001880 4707 scope.go:117] "RemoveContainer" containerID="d1002857d91bbb4e78112591a44e36f2bef70dfc80b0978c0dd542ac0bbf2e1e" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.285200 4707 scope.go:117] "RemoveContainer" containerID="03a9c90b6df5078686ed567815aaaa0464859b7022a274fbf4a95ee49428e46f" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.315167 4707 scope.go:117] "RemoveContainer" containerID="d7bc57e63ee48165733e5e5556190fa9ad2e5fa814ab960b965446b905cdf70c" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.389571 4707 scope.go:117] "RemoveContainer" containerID="00c45d457cee63fe9383969a0ffe0a8b523bcecdf1d1ad3985daaadbb025260c" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.434152 4707 
scope.go:117] "RemoveContainer" containerID="55e11662761f02f30e62ca9478f8b30cbc625cf1d1af2799412407c9f3e2c87d" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.478240 4707 scope.go:117] "RemoveContainer" containerID="9e15fd15b9d508d8e4354452e5ca76ed5444a58e8db9397d7377afccee8c451f" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.524469 4707 scope.go:117] "RemoveContainer" containerID="3d2fee166692052d0ce695e9235b93bbcd303489695f1256d8032871a784429c" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.550290 4707 scope.go:117] "RemoveContainer" containerID="e863074f4f2d6fa39c0994f88ffdaee2b48fbba4e27705032f9821c96cf8d356" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.572747 4707 scope.go:117] "RemoveContainer" containerID="98ae48325409a0d42880eb8cb061055c35df5a5c804948408ca6406a72c8e2cc" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.596826 4707 scope.go:117] "RemoveContainer" containerID="6aeba6acbc0e96892797cb8e103a6dabb18bf651bac1a97c4c4a80df45932d74" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.620910 4707 scope.go:117] "RemoveContainer" containerID="ef158d4a57db53f20fb9ca8511e7e57575fc87180dad99e905fbb24a24e2340f" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.641932 4707 scope.go:117] "RemoveContainer" containerID="096d748d230f8d85731c5abf7119a97caba232a8e5fa7ab9ecddcad3da9da62c" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.680561 4707 scope.go:117] "RemoveContainer" containerID="a8160b81dabed31cb475030f365e6ea1de1347a88ad2958c908250de93b3e35c" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.705422 4707 scope.go:117] "RemoveContainer" containerID="d94c998a7d7df8bac0a22de419445fed83c8d7899f59d768dc82fc00977ce505" Jan 29 03:56:38 crc kubenswrapper[4707]: I0129 03:56:38.753277 4707 scope.go:117] "RemoveContainer" containerID="9c5d58d279b3629677204620cb901b5983d157b881a20e83d92529f049ad3b78" Jan 29 03:56:42 crc kubenswrapper[4707]: I0129 03:56:42.244698 4707 scope.go:117] 
"RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:56:42 crc kubenswrapper[4707]: E0129 03:56:42.245434 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:56:49 crc kubenswrapper[4707]: I0129 03:56:49.041302 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-wbtsz"] Jan 29 03:56:49 crc kubenswrapper[4707]: I0129 03:56:49.051407 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-wbtsz"] Jan 29 03:56:49 crc kubenswrapper[4707]: I0129 03:56:49.261998 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6808d614-6634-4b2a-9e78-7480a0921415" path="/var/lib/kubelet/pods/6808d614-6634-4b2a-9e78-7480a0921415/volumes" Jan 29 03:56:50 crc kubenswrapper[4707]: I0129 03:56:50.035687 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-gzpxq"] Jan 29 03:56:50 crc kubenswrapper[4707]: I0129 03:56:50.047923 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gzpxq"] Jan 29 03:56:51 crc kubenswrapper[4707]: I0129 03:56:51.257812 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b80fc8-a8ca-417d-9f86-d4fb86587f3a" path="/var/lib/kubelet/pods/e6b80fc8-a8ca-417d-9f86-d4fb86587f3a/volumes" Jan 29 03:56:56 crc kubenswrapper[4707]: I0129 03:56:56.244252 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:56:56 crc kubenswrapper[4707]: E0129 03:56:56.245166 4707 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:57:09 crc kubenswrapper[4707]: I0129 03:57:09.244285 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:57:09 crc kubenswrapper[4707]: E0129 03:57:09.245183 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:57:22 crc kubenswrapper[4707]: I0129 03:57:22.244810 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:57:22 crc kubenswrapper[4707]: E0129 03:57:22.246239 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:57:25 crc kubenswrapper[4707]: I0129 03:57:25.734379 4707 generic.go:334] "Generic (PLEG): container finished" podID="98ca67a4-de19-4954-a988-1c743df160cd" containerID="9bf7bcbd0de682f42f2471b214553954e83609a427e8cfb078afa9b88c3d7a79" exitCode=0 Jan 29 03:57:25 
crc kubenswrapper[4707]: I0129 03:57:25.734893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" event={"ID":"98ca67a4-de19-4954-a988-1c743df160cd","Type":"ContainerDied","Data":"9bf7bcbd0de682f42f2471b214553954e83609a427e8cfb078afa9b88c3d7a79"} Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.176462 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.309986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-ssh-key-openstack-edpm-ipam\") pod \"98ca67a4-de19-4954-a988-1c743df160cd\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.310414 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-inventory\") pod \"98ca67a4-de19-4954-a988-1c743df160cd\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.310672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw5j5\" (UniqueName: \"kubernetes.io/projected/98ca67a4-de19-4954-a988-1c743df160cd-kube-api-access-hw5j5\") pod \"98ca67a4-de19-4954-a988-1c743df160cd\" (UID: \"98ca67a4-de19-4954-a988-1c743df160cd\") " Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.316191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ca67a4-de19-4954-a988-1c743df160cd-kube-api-access-hw5j5" (OuterVolumeSpecName: "kube-api-access-hw5j5") pod "98ca67a4-de19-4954-a988-1c743df160cd" (UID: "98ca67a4-de19-4954-a988-1c743df160cd"). 
InnerVolumeSpecName "kube-api-access-hw5j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.337813 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "98ca67a4-de19-4954-a988-1c743df160cd" (UID: "98ca67a4-de19-4954-a988-1c743df160cd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.338693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-inventory" (OuterVolumeSpecName: "inventory") pod "98ca67a4-de19-4954-a988-1c743df160cd" (UID: "98ca67a4-de19-4954-a988-1c743df160cd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.413183 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.413278 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98ca67a4-de19-4954-a988-1c743df160cd-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.413290 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw5j5\" (UniqueName: \"kubernetes.io/projected/98ca67a4-de19-4954-a988-1c743df160cd-kube-api-access-hw5j5\") on node \"crc\" DevicePath \"\"" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.761941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" event={"ID":"98ca67a4-de19-4954-a988-1c743df160cd","Type":"ContainerDied","Data":"acc059bcb0fd6d6f82d11ce5c459c9db349ba18bcc01263f2261b1dfab1be7c3"} Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.761987 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc059bcb0fd6d6f82d11ce5c459c9db349ba18bcc01263f2261b1dfab1be7c3" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.762044 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.834867 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt"] Jan 29 03:57:27 crc kubenswrapper[4707]: E0129 03:57:27.835387 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ca67a4-de19-4954-a988-1c743df160cd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.835412 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ca67a4-de19-4954-a988-1c743df160cd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.835690 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="98ca67a4-de19-4954-a988-1c743df160cd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.836506 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.839065 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.839451 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.839858 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.842019 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.844597 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt"] Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.923508 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-892tt\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.923608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldgwj\" (UniqueName: \"kubernetes.io/projected/bf895266-d1e5-47d5-8d3a-397acefb3f9b-kube-api-access-ldgwj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-892tt\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 
03:57:27 crc kubenswrapper[4707]: I0129 03:57:27.923806 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-892tt\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:28 crc kubenswrapper[4707]: I0129 03:57:28.025936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-892tt\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:28 crc kubenswrapper[4707]: I0129 03:57:28.026043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-892tt\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:28 crc kubenswrapper[4707]: I0129 03:57:28.026094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldgwj\" (UniqueName: \"kubernetes.io/projected/bf895266-d1e5-47d5-8d3a-397acefb3f9b-kube-api-access-ldgwj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-892tt\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:28 crc kubenswrapper[4707]: I0129 03:57:28.030497 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-892tt\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:28 crc kubenswrapper[4707]: I0129 03:57:28.031607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-892tt\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:28 crc kubenswrapper[4707]: I0129 03:57:28.045179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldgwj\" (UniqueName: \"kubernetes.io/projected/bf895266-d1e5-47d5-8d3a-397acefb3f9b-kube-api-access-ldgwj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-892tt\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:28 crc kubenswrapper[4707]: I0129 03:57:28.153139 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:28 crc kubenswrapper[4707]: I0129 03:57:28.667856 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt"] Jan 29 03:57:28 crc kubenswrapper[4707]: I0129 03:57:28.773139 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" event={"ID":"bf895266-d1e5-47d5-8d3a-397acefb3f9b","Type":"ContainerStarted","Data":"10cc07e57cbe770268afee6a3ce4b3c722d807e684c3a64ff29ceb3b8fc4de66"} Jan 29 03:57:29 crc kubenswrapper[4707]: I0129 03:57:29.798291 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" event={"ID":"bf895266-d1e5-47d5-8d3a-397acefb3f9b","Type":"ContainerStarted","Data":"130b614c031cbf8e885f284903003381a341a9503e75724f0bffa86b5c97aed0"} Jan 29 03:57:29 crc kubenswrapper[4707]: I0129 03:57:29.819682 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" podStartSLOduration=2.106382253 podStartE2EDuration="2.819664596s" podCreationTimestamp="2026-01-29 03:57:27 +0000 UTC" firstStartedPulling="2026-01-29 03:57:28.673114932 +0000 UTC m=+1802.157343837" lastFinishedPulling="2026-01-29 03:57:29.386397275 +0000 UTC m=+1802.870626180" observedRunningTime="2026-01-29 03:57:29.815846749 +0000 UTC m=+1803.300075644" watchObservedRunningTime="2026-01-29 03:57:29.819664596 +0000 UTC m=+1803.303893501" Jan 29 03:57:33 crc kubenswrapper[4707]: I0129 03:57:33.039863 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ptdvm"] Jan 29 03:57:33 crc kubenswrapper[4707]: I0129 03:57:33.051726 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5wxq8"] Jan 29 03:57:33 crc 
kubenswrapper[4707]: I0129 03:57:33.062900 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-5wxq8"] Jan 29 03:57:33 crc kubenswrapper[4707]: I0129 03:57:33.073457 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ptdvm"] Jan 29 03:57:33 crc kubenswrapper[4707]: I0129 03:57:33.253491 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b79725c-fee3-4f6f-84b7-c0dbd52e75ce" path="/var/lib/kubelet/pods/7b79725c-fee3-4f6f-84b7-c0dbd52e75ce/volumes" Jan 29 03:57:33 crc kubenswrapper[4707]: I0129 03:57:33.254406 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6d4aa3-52da-4273-8dae-1ac01656cab9" path="/var/lib/kubelet/pods/fa6d4aa3-52da-4273-8dae-1ac01656cab9/volumes" Jan 29 03:57:34 crc kubenswrapper[4707]: I0129 03:57:34.033558 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-bbea-account-create-update-jcvzk"] Jan 29 03:57:34 crc kubenswrapper[4707]: I0129 03:57:34.045717 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3f4f-account-create-update-jdxff"] Jan 29 03:57:34 crc kubenswrapper[4707]: I0129 03:57:34.053855 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-bbea-account-create-update-jcvzk"] Jan 29 03:57:34 crc kubenswrapper[4707]: I0129 03:57:34.060307 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3f4f-account-create-update-jdxff"] Jan 29 03:57:34 crc kubenswrapper[4707]: I0129 03:57:34.244521 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:57:34 crc kubenswrapper[4707]: E0129 03:57:34.245108 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:57:34 crc kubenswrapper[4707]: I0129 03:57:34.849053 4707 generic.go:334] "Generic (PLEG): container finished" podID="bf895266-d1e5-47d5-8d3a-397acefb3f9b" containerID="130b614c031cbf8e885f284903003381a341a9503e75724f0bffa86b5c97aed0" exitCode=0 Jan 29 03:57:34 crc kubenswrapper[4707]: I0129 03:57:34.849125 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" event={"ID":"bf895266-d1e5-47d5-8d3a-397acefb3f9b","Type":"ContainerDied","Data":"130b614c031cbf8e885f284903003381a341a9503e75724f0bffa86b5c97aed0"} Jan 29 03:57:35 crc kubenswrapper[4707]: I0129 03:57:35.033204 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-lzdkh"] Jan 29 03:57:35 crc kubenswrapper[4707]: I0129 03:57:35.041303 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-aabf-account-create-update-xhfqq"] Jan 29 03:57:35 crc kubenswrapper[4707]: I0129 03:57:35.049291 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-lzdkh"] Jan 29 03:57:35 crc kubenswrapper[4707]: I0129 03:57:35.057225 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-aabf-account-create-update-xhfqq"] Jan 29 03:57:35 crc kubenswrapper[4707]: I0129 03:57:35.253605 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64" path="/var/lib/kubelet/pods/0cf1fde7-13f9-40cb-a3f7-ed20d76b0e64/volumes" Jan 29 03:57:35 crc kubenswrapper[4707]: I0129 03:57:35.254224 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d368b24-5c6c-4828-af36-7d553daeee3c" 
path="/var/lib/kubelet/pods/3d368b24-5c6c-4828-af36-7d553daeee3c/volumes" Jan 29 03:57:35 crc kubenswrapper[4707]: I0129 03:57:35.254876 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4264d1cc-366e-412c-9f2d-8276c6b5fc70" path="/var/lib/kubelet/pods/4264d1cc-366e-412c-9f2d-8276c6b5fc70/volumes" Jan 29 03:57:35 crc kubenswrapper[4707]: I0129 03:57:35.255396 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b34d90a-4443-41fb-8220-d07465c9faa1" path="/var/lib/kubelet/pods/9b34d90a-4443-41fb-8220-d07465c9faa1/volumes" Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.309728 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.389885 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-ssh-key-openstack-edpm-ipam\") pod \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.389959 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldgwj\" (UniqueName: \"kubernetes.io/projected/bf895266-d1e5-47d5-8d3a-397acefb3f9b-kube-api-access-ldgwj\") pod \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.389991 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-inventory\") pod \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\" (UID: \"bf895266-d1e5-47d5-8d3a-397acefb3f9b\") " Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.397966 4707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/bf895266-d1e5-47d5-8d3a-397acefb3f9b-kube-api-access-ldgwj" (OuterVolumeSpecName: "kube-api-access-ldgwj") pod "bf895266-d1e5-47d5-8d3a-397acefb3f9b" (UID: "bf895266-d1e5-47d5-8d3a-397acefb3f9b"). InnerVolumeSpecName "kube-api-access-ldgwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.425478 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf895266-d1e5-47d5-8d3a-397acefb3f9b" (UID: "bf895266-d1e5-47d5-8d3a-397acefb3f9b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.425850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-inventory" (OuterVolumeSpecName: "inventory") pod "bf895266-d1e5-47d5-8d3a-397acefb3f9b" (UID: "bf895266-d1e5-47d5-8d3a-397acefb3f9b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.492973 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.493008 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldgwj\" (UniqueName: \"kubernetes.io/projected/bf895266-d1e5-47d5-8d3a-397acefb3f9b-kube-api-access-ldgwj\") on node \"crc\" DevicePath \"\"" Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.493036 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf895266-d1e5-47d5-8d3a-397acefb3f9b-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.867971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" event={"ID":"bf895266-d1e5-47d5-8d3a-397acefb3f9b","Type":"ContainerDied","Data":"10cc07e57cbe770268afee6a3ce4b3c722d807e684c3a64ff29ceb3b8fc4de66"} Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.868391 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10cc07e57cbe770268afee6a3ce4b3c722d807e684c3a64ff29ceb3b8fc4de66" Jan 29 03:57:36 crc kubenswrapper[4707]: I0129 03:57:36.868649 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-892tt" Jan 29 03:57:37 crc kubenswrapper[4707]: I0129 03:57:37.902805 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42"] Jan 29 03:57:37 crc kubenswrapper[4707]: E0129 03:57:37.903596 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf895266-d1e5-47d5-8d3a-397acefb3f9b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 29 03:57:37 crc kubenswrapper[4707]: I0129 03:57:37.903610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf895266-d1e5-47d5-8d3a-397acefb3f9b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 29 03:57:37 crc kubenswrapper[4707]: I0129 03:57:37.903815 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf895266-d1e5-47d5-8d3a-397acefb3f9b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 29 03:57:37 crc kubenswrapper[4707]: I0129 03:57:37.904432 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:37 crc kubenswrapper[4707]: I0129 03:57:37.906886 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:57:37 crc kubenswrapper[4707]: I0129 03:57:37.907308 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:57:37 crc kubenswrapper[4707]: I0129 03:57:37.907350 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:57:37 crc kubenswrapper[4707]: I0129 03:57:37.907367 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:57:37 crc kubenswrapper[4707]: I0129 03:57:37.927337 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42"] Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.038963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b4b42\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.039017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4lb\" (UniqueName: \"kubernetes.io/projected/1503434a-951a-4e31-836e-c1f37b794d45-kube-api-access-cm4lb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b4b42\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 
03:57:38.039066 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b4b42\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.140910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b4b42\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.140963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4lb\" (UniqueName: \"kubernetes.io/projected/1503434a-951a-4e31-836e-c1f37b794d45-kube-api-access-cm4lb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b4b42\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.141016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b4b42\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.146629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-b4b42\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.148037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b4b42\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.167984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4lb\" (UniqueName: \"kubernetes.io/projected/1503434a-951a-4e31-836e-c1f37b794d45-kube-api-access-cm4lb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b4b42\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.234343 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.725744 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42"] Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.888760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" event={"ID":"1503434a-951a-4e31-836e-c1f37b794d45","Type":"ContainerStarted","Data":"ae6c007028dc12ba7c7f5e55e13d85deddc3e7931d6be91f183c467bd582472e"} Jan 29 03:57:38 crc kubenswrapper[4707]: I0129 03:57:38.982626 4707 scope.go:117] "RemoveContainer" containerID="db24f942fa5db4ad356378e0559dff3af60fef107a60255ec0a6475caefa47d2" Jan 29 03:57:39 crc kubenswrapper[4707]: I0129 03:57:39.003245 4707 scope.go:117] "RemoveContainer" containerID="8407fb29d0fe3d3fb94280f256eefd1b6681ef7b3ba4a22423a6518dfeffdd18" Jan 29 03:57:39 crc kubenswrapper[4707]: I0129 03:57:39.028290 4707 scope.go:117] "RemoveContainer" containerID="b9d204c3a77405fd52e4a98e1a2adcf4f0ad5d8b9ad58dbfaa1562a18242efb6" Jan 29 03:57:39 crc kubenswrapper[4707]: I0129 03:57:39.059031 4707 scope.go:117] "RemoveContainer" containerID="e802bb1aa8450675870feccb01861e8857fb6fec39300bf58877fb4a677f92f5" Jan 29 03:57:39 crc kubenswrapper[4707]: I0129 03:57:39.078373 4707 scope.go:117] "RemoveContainer" containerID="c94e11f8ba76a194d102eab4b4403b3307d9fe0b9fc8a11ff064dc357716833a" Jan 29 03:57:39 crc kubenswrapper[4707]: I0129 03:57:39.096998 4707 scope.go:117] "RemoveContainer" containerID="c16abc9e22c3d3d572f7c87b2770135394cb1ead6a210a0738b60f95009fa4c1" Jan 29 03:57:39 crc kubenswrapper[4707]: I0129 03:57:39.130523 4707 scope.go:117] "RemoveContainer" containerID="bec773fe3ccd0405d46c8774cb69869fba69c76270893dd022abf7bc1e25a200" Jan 29 03:57:39 crc kubenswrapper[4707]: I0129 03:57:39.148698 4707 scope.go:117] "RemoveContainer" 
containerID="ce1f1515c142d5cfc720c2d0577fa37fa94df64c46af092314277a817349d2e2" Jan 29 03:57:39 crc kubenswrapper[4707]: I0129 03:57:39.906203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" event={"ID":"1503434a-951a-4e31-836e-c1f37b794d45","Type":"ContainerStarted","Data":"5c49dc7804f1c46a55093244c1795b5db62a4d9975e55838af962158f68efed8"} Jan 29 03:57:39 crc kubenswrapper[4707]: I0129 03:57:39.937323 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" podStartSLOduration=2.092930324 podStartE2EDuration="2.937297785s" podCreationTimestamp="2026-01-29 03:57:37 +0000 UTC" firstStartedPulling="2026-01-29 03:57:38.736695229 +0000 UTC m=+1812.220924134" lastFinishedPulling="2026-01-29 03:57:39.58106267 +0000 UTC m=+1813.065291595" observedRunningTime="2026-01-29 03:57:39.931781561 +0000 UTC m=+1813.416010476" watchObservedRunningTime="2026-01-29 03:57:39.937297785 +0000 UTC m=+1813.421526710" Jan 29 03:57:48 crc kubenswrapper[4707]: I0129 03:57:48.243595 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:57:48 crc kubenswrapper[4707]: E0129 03:57:48.244582 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:57:59 crc kubenswrapper[4707]: I0129 03:57:59.243948 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:57:59 crc kubenswrapper[4707]: E0129 03:57:59.244750 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 03:58:08 crc kubenswrapper[4707]: I0129 03:58:08.038347 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5bzlc"] Jan 29 03:58:08 crc kubenswrapper[4707]: I0129 03:58:08.046028 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5bzlc"] Jan 29 03:58:09 crc kubenswrapper[4707]: I0129 03:58:09.253038 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e0c927-bc87-4eb0-b565-e30b4278331c" path="/var/lib/kubelet/pods/e9e0c927-bc87-4eb0-b565-e30b4278331c/volumes" Jan 29 03:58:12 crc kubenswrapper[4707]: I0129 03:58:12.188185 4707 generic.go:334] "Generic (PLEG): container finished" podID="1503434a-951a-4e31-836e-c1f37b794d45" containerID="5c49dc7804f1c46a55093244c1795b5db62a4d9975e55838af962158f68efed8" exitCode=0 Jan 29 03:58:12 crc kubenswrapper[4707]: I0129 03:58:12.188260 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" event={"ID":"1503434a-951a-4e31-836e-c1f37b794d45","Type":"ContainerDied","Data":"5c49dc7804f1c46a55093244c1795b5db62a4d9975e55838af962158f68efed8"} Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.246617 4707 scope.go:117] "RemoveContainer" containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.661859 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.701969 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-inventory\") pod \"1503434a-951a-4e31-836e-c1f37b794d45\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.702075 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm4lb\" (UniqueName: \"kubernetes.io/projected/1503434a-951a-4e31-836e-c1f37b794d45-kube-api-access-cm4lb\") pod \"1503434a-951a-4e31-836e-c1f37b794d45\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.702305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-ssh-key-openstack-edpm-ipam\") pod \"1503434a-951a-4e31-836e-c1f37b794d45\" (UID: \"1503434a-951a-4e31-836e-c1f37b794d45\") " Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.709991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1503434a-951a-4e31-836e-c1f37b794d45-kube-api-access-cm4lb" (OuterVolumeSpecName: "kube-api-access-cm4lb") pod "1503434a-951a-4e31-836e-c1f37b794d45" (UID: "1503434a-951a-4e31-836e-c1f37b794d45"). InnerVolumeSpecName "kube-api-access-cm4lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.734154 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-inventory" (OuterVolumeSpecName: "inventory") pod "1503434a-951a-4e31-836e-c1f37b794d45" (UID: "1503434a-951a-4e31-836e-c1f37b794d45"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.738301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1503434a-951a-4e31-836e-c1f37b794d45" (UID: "1503434a-951a-4e31-836e-c1f37b794d45"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.806393 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.806434 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1503434a-951a-4e31-836e-c1f37b794d45-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:58:13 crc kubenswrapper[4707]: I0129 03:58:13.806445 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm4lb\" (UniqueName: \"kubernetes.io/projected/1503434a-951a-4e31-836e-c1f37b794d45-kube-api-access-cm4lb\") on node \"crc\" DevicePath \"\"" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.205642 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.205640 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b4b42" event={"ID":"1503434a-951a-4e31-836e-c1f37b794d45","Type":"ContainerDied","Data":"ae6c007028dc12ba7c7f5e55e13d85deddc3e7931d6be91f183c467bd582472e"} Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.206184 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae6c007028dc12ba7c7f5e55e13d85deddc3e7931d6be91f183c467bd582472e" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.209069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"9075a066370335e7cc1a1ac646b318a6399393468866cff65b48a27e1e3b8714"} Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.400595 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2"] Jan 29 03:58:14 crc kubenswrapper[4707]: E0129 03:58:14.401117 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1503434a-951a-4e31-836e-c1f37b794d45" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.401136 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1503434a-951a-4e31-836e-c1f37b794d45" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.401364 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1503434a-951a-4e31-836e-c1f37b794d45" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.402164 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.407955 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.408260 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.408487 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.408786 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.418328 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2"] Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.518719 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57btw\" (UniqueName: \"kubernetes.io/projected/933d5dc9-d255-45c9-837d-251701e8fd77-kube-api-access-57btw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.518817 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.519033 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.621743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.622436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57btw\" (UniqueName: \"kubernetes.io/projected/933d5dc9-d255-45c9-837d-251701e8fd77-kube-api-access-57btw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.622614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.627976 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.635518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.643095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57btw\" (UniqueName: \"kubernetes.io/projected/933d5dc9-d255-45c9-837d-251701e8fd77-kube-api-access-57btw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:14 crc kubenswrapper[4707]: I0129 03:58:14.793798 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:58:15 crc kubenswrapper[4707]: I0129 03:58:15.333378 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2"] Jan 29 03:58:16 crc kubenswrapper[4707]: I0129 03:58:16.223861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" event={"ID":"933d5dc9-d255-45c9-837d-251701e8fd77","Type":"ContainerStarted","Data":"34697a3f4afc1f2982df93b05e9ca4d359b2af75725bc17d2ffe36b6a0c58d2d"} Jan 29 03:58:16 crc kubenswrapper[4707]: I0129 03:58:16.224451 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" event={"ID":"933d5dc9-d255-45c9-837d-251701e8fd77","Type":"ContainerStarted","Data":"4378905a44971c55ffe27e1f16768817b11580c57dddcfe3c3fd428e7f88f062"} Jan 29 03:58:16 crc kubenswrapper[4707]: I0129 03:58:16.246504 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" podStartSLOduration=1.776147039 podStartE2EDuration="2.246484087s" podCreationTimestamp="2026-01-29 03:58:14 +0000 UTC" firstStartedPulling="2026-01-29 03:58:15.335428751 +0000 UTC m=+1848.819657656" lastFinishedPulling="2026-01-29 03:58:15.805765789 +0000 UTC m=+1849.289994704" observedRunningTime="2026-01-29 03:58:16.246344273 +0000 UTC m=+1849.730573178" watchObservedRunningTime="2026-01-29 03:58:16.246484087 +0000 UTC m=+1849.730712992" Jan 29 03:58:39 crc kubenswrapper[4707]: I0129 03:58:39.363420 4707 scope.go:117] "RemoveContainer" containerID="86f6e194922acfc714150c39a7ebbecba8151b690f5517f5208282a9daf9d67c" Jan 29 03:59:01 crc kubenswrapper[4707]: I0129 03:59:01.698334 4707 generic.go:334] "Generic (PLEG): container finished" podID="933d5dc9-d255-45c9-837d-251701e8fd77" 
containerID="34697a3f4afc1f2982df93b05e9ca4d359b2af75725bc17d2ffe36b6a0c58d2d" exitCode=0 Jan 29 03:59:01 crc kubenswrapper[4707]: I0129 03:59:01.698413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" event={"ID":"933d5dc9-d255-45c9-837d-251701e8fd77","Type":"ContainerDied","Data":"34697a3f4afc1f2982df93b05e9ca4d359b2af75725bc17d2ffe36b6a0c58d2d"} Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.125881 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.188942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57btw\" (UniqueName: \"kubernetes.io/projected/933d5dc9-d255-45c9-837d-251701e8fd77-kube-api-access-57btw\") pod \"933d5dc9-d255-45c9-837d-251701e8fd77\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.189283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-inventory\") pod \"933d5dc9-d255-45c9-837d-251701e8fd77\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.189507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-ssh-key-openstack-edpm-ipam\") pod \"933d5dc9-d255-45c9-837d-251701e8fd77\" (UID: \"933d5dc9-d255-45c9-837d-251701e8fd77\") " Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.195795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/933d5dc9-d255-45c9-837d-251701e8fd77-kube-api-access-57btw" (OuterVolumeSpecName: 
"kube-api-access-57btw") pod "933d5dc9-d255-45c9-837d-251701e8fd77" (UID: "933d5dc9-d255-45c9-837d-251701e8fd77"). InnerVolumeSpecName "kube-api-access-57btw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.217107 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "933d5dc9-d255-45c9-837d-251701e8fd77" (UID: "933d5dc9-d255-45c9-837d-251701e8fd77"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.219895 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-inventory" (OuterVolumeSpecName: "inventory") pod "933d5dc9-d255-45c9-837d-251701e8fd77" (UID: "933d5dc9-d255-45c9-837d-251701e8fd77"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.292513 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.292568 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57btw\" (UniqueName: \"kubernetes.io/projected/933d5dc9-d255-45c9-837d-251701e8fd77-kube-api-access-57btw\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.292586 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/933d5dc9-d255-45c9-837d-251701e8fd77-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.724016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" event={"ID":"933d5dc9-d255-45c9-837d-251701e8fd77","Type":"ContainerDied","Data":"4378905a44971c55ffe27e1f16768817b11580c57dddcfe3c3fd428e7f88f062"} Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.724050 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.724064 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4378905a44971c55ffe27e1f16768817b11580c57dddcfe3c3fd428e7f88f062" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.829561 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r7f6d"] Jan 29 03:59:03 crc kubenswrapper[4707]: E0129 03:59:03.830107 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933d5dc9-d255-45c9-837d-251701e8fd77" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.830128 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="933d5dc9-d255-45c9-837d-251701e8fd77" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.830425 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="933d5dc9-d255-45c9-837d-251701e8fd77" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.831150 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.833886 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.834126 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.836157 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.836658 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.851493 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r7f6d"] Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.911689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r7f6d\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.911759 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vc7\" (UniqueName: \"kubernetes.io/projected/c8e32f79-d8b0-424c-a23e-fe94623016de-kube-api-access-v9vc7\") pod \"ssh-known-hosts-edpm-deployment-r7f6d\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:03 crc kubenswrapper[4707]: I0129 03:59:03.911796 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r7f6d\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:04 crc kubenswrapper[4707]: I0129 03:59:04.012551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r7f6d\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:04 crc kubenswrapper[4707]: I0129 03:59:04.012618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9vc7\" (UniqueName: \"kubernetes.io/projected/c8e32f79-d8b0-424c-a23e-fe94623016de-kube-api-access-v9vc7\") pod \"ssh-known-hosts-edpm-deployment-r7f6d\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:04 crc kubenswrapper[4707]: I0129 03:59:04.012652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r7f6d\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:04 crc kubenswrapper[4707]: I0129 03:59:04.019107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r7f6d\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 
29 03:59:04 crc kubenswrapper[4707]: I0129 03:59:04.022075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r7f6d\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:04 crc kubenswrapper[4707]: I0129 03:59:04.030081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vc7\" (UniqueName: \"kubernetes.io/projected/c8e32f79-d8b0-424c-a23e-fe94623016de-kube-api-access-v9vc7\") pod \"ssh-known-hosts-edpm-deployment-r7f6d\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:04 crc kubenswrapper[4707]: I0129 03:59:04.152275 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:04 crc kubenswrapper[4707]: I0129 03:59:04.715559 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r7f6d"] Jan 29 03:59:05 crc kubenswrapper[4707]: I0129 03:59:05.740612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" event={"ID":"c8e32f79-d8b0-424c-a23e-fe94623016de","Type":"ContainerStarted","Data":"82d69ea9cd245539f1b9519462527d8c2db93e705823e954446f2a9ab9a4f5bf"} Jan 29 03:59:05 crc kubenswrapper[4707]: I0129 03:59:05.740969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" event={"ID":"c8e32f79-d8b0-424c-a23e-fe94623016de","Type":"ContainerStarted","Data":"25383fca0c52742306249b695cd742675635fbbfa1ab0d6fafc36f8e523d8d0a"} Jan 29 03:59:05 crc kubenswrapper[4707]: I0129 03:59:05.767705 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" podStartSLOduration=2.339819684 podStartE2EDuration="2.767676563s" podCreationTimestamp="2026-01-29 03:59:03 +0000 UTC" firstStartedPulling="2026-01-29 03:59:04.729259974 +0000 UTC m=+1898.213488879" lastFinishedPulling="2026-01-29 03:59:05.157116853 +0000 UTC m=+1898.641345758" observedRunningTime="2026-01-29 03:59:05.759115703 +0000 UTC m=+1899.243344628" watchObservedRunningTime="2026-01-29 03:59:05.767676563 +0000 UTC m=+1899.251905468" Jan 29 03:59:11 crc kubenswrapper[4707]: I0129 03:59:11.798230 4707 generic.go:334] "Generic (PLEG): container finished" podID="c8e32f79-d8b0-424c-a23e-fe94623016de" containerID="82d69ea9cd245539f1b9519462527d8c2db93e705823e954446f2a9ab9a4f5bf" exitCode=0 Jan 29 03:59:11 crc kubenswrapper[4707]: I0129 03:59:11.798453 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" event={"ID":"c8e32f79-d8b0-424c-a23e-fe94623016de","Type":"ContainerDied","Data":"82d69ea9cd245539f1b9519462527d8c2db93e705823e954446f2a9ab9a4f5bf"} Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.056171 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vm7jk"] Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.066926 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vm7jk"] Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.237801 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.264456 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e0e811-830f-4e5c-a3bb-0dd5be01cf3d" path="/var/lib/kubelet/pods/83e0e811-830f-4e5c-a3bb-0dd5be01cf3d/volumes" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.419687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9vc7\" (UniqueName: \"kubernetes.io/projected/c8e32f79-d8b0-424c-a23e-fe94623016de-kube-api-access-v9vc7\") pod \"c8e32f79-d8b0-424c-a23e-fe94623016de\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.419867 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-inventory-0\") pod \"c8e32f79-d8b0-424c-a23e-fe94623016de\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.419947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-ssh-key-openstack-edpm-ipam\") pod \"c8e32f79-d8b0-424c-a23e-fe94623016de\" (UID: \"c8e32f79-d8b0-424c-a23e-fe94623016de\") " Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.425847 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e32f79-d8b0-424c-a23e-fe94623016de-kube-api-access-v9vc7" (OuterVolumeSpecName: "kube-api-access-v9vc7") pod "c8e32f79-d8b0-424c-a23e-fe94623016de" (UID: "c8e32f79-d8b0-424c-a23e-fe94623016de"). InnerVolumeSpecName "kube-api-access-v9vc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.446885 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c8e32f79-d8b0-424c-a23e-fe94623016de" (UID: "c8e32f79-d8b0-424c-a23e-fe94623016de"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.453158 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c8e32f79-d8b0-424c-a23e-fe94623016de" (UID: "c8e32f79-d8b0-424c-a23e-fe94623016de"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.522685 4707 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.522965 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8e32f79-d8b0-424c-a23e-fe94623016de-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.523124 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9vc7\" (UniqueName: \"kubernetes.io/projected/c8e32f79-d8b0-424c-a23e-fe94623016de-kube-api-access-v9vc7\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.817655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" 
event={"ID":"c8e32f79-d8b0-424c-a23e-fe94623016de","Type":"ContainerDied","Data":"25383fca0c52742306249b695cd742675635fbbfa1ab0d6fafc36f8e523d8d0a"} Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.818000 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25383fca0c52742306249b695cd742675635fbbfa1ab0d6fafc36f8e523d8d0a" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.817722 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r7f6d" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.894756 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6"] Jan 29 03:59:13 crc kubenswrapper[4707]: E0129 03:59:13.895168 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e32f79-d8b0-424c-a23e-fe94623016de" containerName="ssh-known-hosts-edpm-deployment" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.895183 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e32f79-d8b0-424c-a23e-fe94623016de" containerName="ssh-known-hosts-edpm-deployment" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.895401 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e32f79-d8b0-424c-a23e-fe94623016de" containerName="ssh-known-hosts-edpm-deployment" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.896043 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.898616 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.898825 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.900104 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.900687 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.909348 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6"] Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.945791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2t5r\" (UniqueName: \"kubernetes.io/projected/21dfe5ce-4935-46c4-8124-cb4fecf0a906-kube-api-access-h2t5r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ncft6\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.945879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ncft6\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:13 crc kubenswrapper[4707]: I0129 03:59:13.945938 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ncft6\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.031231 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vqq2"] Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.040846 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5vqq2"] Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.049101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2t5r\" (UniqueName: \"kubernetes.io/projected/21dfe5ce-4935-46c4-8124-cb4fecf0a906-kube-api-access-h2t5r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ncft6\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.049212 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ncft6\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.049262 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ncft6\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.054750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ncft6\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.056241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ncft6\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.065907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2t5r\" (UniqueName: \"kubernetes.io/projected/21dfe5ce-4935-46c4-8124-cb4fecf0a906-kube-api-access-h2t5r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ncft6\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.257989 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:14 crc kubenswrapper[4707]: I0129 03:59:14.885357 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6"] Jan 29 03:59:15 crc kubenswrapper[4707]: I0129 03:59:15.254243 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ededf3cb-8e51-4ba6-a3d1-dc12980e13d1" path="/var/lib/kubelet/pods/ededf3cb-8e51-4ba6-a3d1-dc12980e13d1/volumes" Jan 29 03:59:15 crc kubenswrapper[4707]: I0129 03:59:15.834421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" event={"ID":"21dfe5ce-4935-46c4-8124-cb4fecf0a906","Type":"ContainerStarted","Data":"b64f78fb1d3cdd0ffa019599123767d453d93732b84c483f2c198a66de78eb13"} Jan 29 03:59:15 crc kubenswrapper[4707]: I0129 03:59:15.834774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" event={"ID":"21dfe5ce-4935-46c4-8124-cb4fecf0a906","Type":"ContainerStarted","Data":"8a139e1277ad2a2d1242522f2a4b59cfd34fec9a8ff2b9e5d64f2238b56454a2"} Jan 29 03:59:15 crc kubenswrapper[4707]: I0129 03:59:15.856973 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" podStartSLOduration=2.440274695 podStartE2EDuration="2.856951071s" podCreationTimestamp="2026-01-29 03:59:13 +0000 UTC" firstStartedPulling="2026-01-29 03:59:14.899226469 +0000 UTC m=+1908.383455374" lastFinishedPulling="2026-01-29 03:59:15.315902845 +0000 UTC m=+1908.800131750" observedRunningTime="2026-01-29 03:59:15.848299869 +0000 UTC m=+1909.332528774" watchObservedRunningTime="2026-01-29 03:59:15.856951071 +0000 UTC m=+1909.341179976" Jan 29 03:59:22 crc kubenswrapper[4707]: I0129 03:59:22.892150 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="21dfe5ce-4935-46c4-8124-cb4fecf0a906" containerID="b64f78fb1d3cdd0ffa019599123767d453d93732b84c483f2c198a66de78eb13" exitCode=0 Jan 29 03:59:22 crc kubenswrapper[4707]: I0129 03:59:22.892228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" event={"ID":"21dfe5ce-4935-46c4-8124-cb4fecf0a906","Type":"ContainerDied","Data":"b64f78fb1d3cdd0ffa019599123767d453d93732b84c483f2c198a66de78eb13"} Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.353149 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.457119 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-ssh-key-openstack-edpm-ipam\") pod \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.457457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-inventory\") pod \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.457592 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2t5r\" (UniqueName: \"kubernetes.io/projected/21dfe5ce-4935-46c4-8124-cb4fecf0a906-kube-api-access-h2t5r\") pod \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\" (UID: \"21dfe5ce-4935-46c4-8124-cb4fecf0a906\") " Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.463729 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21dfe5ce-4935-46c4-8124-cb4fecf0a906-kube-api-access-h2t5r" 
(OuterVolumeSpecName: "kube-api-access-h2t5r") pod "21dfe5ce-4935-46c4-8124-cb4fecf0a906" (UID: "21dfe5ce-4935-46c4-8124-cb4fecf0a906"). InnerVolumeSpecName "kube-api-access-h2t5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.489230 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-inventory" (OuterVolumeSpecName: "inventory") pod "21dfe5ce-4935-46c4-8124-cb4fecf0a906" (UID: "21dfe5ce-4935-46c4-8124-cb4fecf0a906"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.489646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "21dfe5ce-4935-46c4-8124-cb4fecf0a906" (UID: "21dfe5ce-4935-46c4-8124-cb4fecf0a906"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.560651 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.560686 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21dfe5ce-4935-46c4-8124-cb4fecf0a906-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.560697 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2t5r\" (UniqueName: \"kubernetes.io/projected/21dfe5ce-4935-46c4-8124-cb4fecf0a906-kube-api-access-h2t5r\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.912119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" event={"ID":"21dfe5ce-4935-46c4-8124-cb4fecf0a906","Type":"ContainerDied","Data":"8a139e1277ad2a2d1242522f2a4b59cfd34fec9a8ff2b9e5d64f2238b56454a2"} Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.912499 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a139e1277ad2a2d1242522f2a4b59cfd34fec9a8ff2b9e5d64f2238b56454a2" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.912244 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ncft6" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.988594 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm"] Jan 29 03:59:24 crc kubenswrapper[4707]: E0129 03:59:24.989032 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21dfe5ce-4935-46c4-8124-cb4fecf0a906" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.989053 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dfe5ce-4935-46c4-8124-cb4fecf0a906" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.989294 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="21dfe5ce-4935-46c4-8124-cb4fecf0a906" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.989959 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.993096 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.993292 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.994009 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:59:24 crc kubenswrapper[4707]: I0129 03:59:24.994791 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.008867 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm"] Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.069969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76wdl\" (UniqueName: \"kubernetes.io/projected/c932c837-2020-4db4-8598-c4803eff8029-kube-api-access-76wdl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.070063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 
03:59:25.070173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.172127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.172572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.172753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76wdl\" (UniqueName: \"kubernetes.io/projected/c932c837-2020-4db4-8598-c4803eff8029-kube-api-access-76wdl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.180225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.190132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.194763 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76wdl\" (UniqueName: \"kubernetes.io/projected/c932c837-2020-4db4-8598-c4803eff8029-kube-api-access-76wdl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.310610 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.829502 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm"] Jan 29 03:59:25 crc kubenswrapper[4707]: I0129 03:59:25.923806 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" event={"ID":"c932c837-2020-4db4-8598-c4803eff8029","Type":"ContainerStarted","Data":"1915b8ea539c4690e4f83be8b4a2e61ab68f99115a7bad9dac96587d7842afad"} Jan 29 03:59:26 crc kubenswrapper[4707]: I0129 03:59:26.949905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" event={"ID":"c932c837-2020-4db4-8598-c4803eff8029","Type":"ContainerStarted","Data":"b01c311b4cf866a3fff7d0894adf6171f4da25edeacbfe067e2ab1f0f8e67f65"} Jan 29 03:59:26 crc kubenswrapper[4707]: I0129 03:59:26.978507 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" podStartSLOduration=2.518487376 podStartE2EDuration="2.978479804s" podCreationTimestamp="2026-01-29 03:59:24 +0000 UTC" firstStartedPulling="2026-01-29 03:59:25.837407473 +0000 UTC m=+1919.321636418" lastFinishedPulling="2026-01-29 03:59:26.297399941 +0000 UTC m=+1919.781628846" observedRunningTime="2026-01-29 03:59:26.975789058 +0000 UTC m=+1920.460017983" watchObservedRunningTime="2026-01-29 03:59:26.978479804 +0000 UTC m=+1920.462708719" Jan 29 03:59:36 crc kubenswrapper[4707]: I0129 03:59:36.035831 4707 generic.go:334] "Generic (PLEG): container finished" podID="c932c837-2020-4db4-8598-c4803eff8029" containerID="b01c311b4cf866a3fff7d0894adf6171f4da25edeacbfe067e2ab1f0f8e67f65" exitCode=0 Jan 29 03:59:36 crc kubenswrapper[4707]: I0129 03:59:36.035921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" event={"ID":"c932c837-2020-4db4-8598-c4803eff8029","Type":"ContainerDied","Data":"b01c311b4cf866a3fff7d0894adf6171f4da25edeacbfe067e2ab1f0f8e67f65"} Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.516414 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.664372 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-inventory\") pod \"c932c837-2020-4db4-8598-c4803eff8029\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.664702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-ssh-key-openstack-edpm-ipam\") pod \"c932c837-2020-4db4-8598-c4803eff8029\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.664789 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76wdl\" (UniqueName: \"kubernetes.io/projected/c932c837-2020-4db4-8598-c4803eff8029-kube-api-access-76wdl\") pod \"c932c837-2020-4db4-8598-c4803eff8029\" (UID: \"c932c837-2020-4db4-8598-c4803eff8029\") " Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.670933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c932c837-2020-4db4-8598-c4803eff8029-kube-api-access-76wdl" (OuterVolumeSpecName: "kube-api-access-76wdl") pod "c932c837-2020-4db4-8598-c4803eff8029" (UID: "c932c837-2020-4db4-8598-c4803eff8029"). InnerVolumeSpecName "kube-api-access-76wdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.691748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c932c837-2020-4db4-8598-c4803eff8029" (UID: "c932c837-2020-4db4-8598-c4803eff8029"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.696323 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-inventory" (OuterVolumeSpecName: "inventory") pod "c932c837-2020-4db4-8598-c4803eff8029" (UID: "c932c837-2020-4db4-8598-c4803eff8029"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.767710 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.767740 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76wdl\" (UniqueName: \"kubernetes.io/projected/c932c837-2020-4db4-8598-c4803eff8029-kube-api-access-76wdl\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:37 crc kubenswrapper[4707]: I0129 03:59:37.767751 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c932c837-2020-4db4-8598-c4803eff8029-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.055817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" 
event={"ID":"c932c837-2020-4db4-8598-c4803eff8029","Type":"ContainerDied","Data":"1915b8ea539c4690e4f83be8b4a2e61ab68f99115a7bad9dac96587d7842afad"} Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.055874 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1915b8ea539c4690e4f83be8b4a2e61ab68f99115a7bad9dac96587d7842afad" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.055925 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.143064 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p"] Jan 29 03:59:38 crc kubenswrapper[4707]: E0129 03:59:38.143922 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c932c837-2020-4db4-8598-c4803eff8029" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.143946 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c932c837-2020-4db4-8598-c4803eff8029" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.144211 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c932c837-2020-4db4-8598-c4803eff8029" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.147120 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.149320 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.149688 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.150314 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.150402 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.150904 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.150963 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.151005 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.151102 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.163212 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p"] Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bqrj\" (UniqueName: 
\"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-kube-api-access-6bqrj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277816 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277877 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.277982 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.278015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.278038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.278104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.278128 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383678 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383729 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383804 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bqrj\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-kube-api-access-6bqrj\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383924 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.383990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.384024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.384058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.390965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc 
kubenswrapper[4707]: I0129 03:59:38.391697 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.391864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.392112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.392218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.393031 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.394033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.395468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.396468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.398879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.398988 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.402034 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.402849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bqrj\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-kube-api-access-6bqrj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.403130 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:38 crc kubenswrapper[4707]: I0129 03:59:38.474022 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 03:59:39 crc kubenswrapper[4707]: I0129 03:59:39.031604 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p"] Jan 29 03:59:39 crc kubenswrapper[4707]: I0129 03:59:39.034586 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 03:59:39 crc kubenswrapper[4707]: I0129 03:59:39.064231 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" event={"ID":"945bd58d-5ea2-4118-a675-3b7b127d9d4c","Type":"ContainerStarted","Data":"fbce099c151bc9b194002d66694f47c038ae6006723cb8efdd662f6b8d9d1923"} Jan 29 03:59:39 crc kubenswrapper[4707]: I0129 03:59:39.474943 4707 scope.go:117] "RemoveContainer" containerID="89cdc2f080c47e24de0dc68ad9f0a1af58af820c56016721a5bde4bfcabf3422" Jan 29 03:59:39 crc kubenswrapper[4707]: I0129 03:59:39.590373 4707 scope.go:117] "RemoveContainer" containerID="9f316524c65dbf31e260d11849fb730356d7a03a052a854227521385e116f055" Jan 29 03:59:40 crc kubenswrapper[4707]: I0129 03:59:40.074944 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" event={"ID":"945bd58d-5ea2-4118-a675-3b7b127d9d4c","Type":"ContainerStarted","Data":"32c8174e277301b9996f1dfdcad00602b56aee6507967b76a071e7c23196522d"} Jan 29 03:59:40 crc kubenswrapper[4707]: I0129 03:59:40.106024 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" 
podStartSLOduration=1.705759246 podStartE2EDuration="2.105997612s" podCreationTimestamp="2026-01-29 03:59:38 +0000 UTC" firstStartedPulling="2026-01-29 03:59:39.034255702 +0000 UTC m=+1932.518484607" lastFinishedPulling="2026-01-29 03:59:39.434494068 +0000 UTC m=+1932.918722973" observedRunningTime="2026-01-29 03:59:40.101033723 +0000 UTC m=+1933.585262668" watchObservedRunningTime="2026-01-29 03:59:40.105997612 +0000 UTC m=+1933.590226557" Jan 29 03:59:57 crc kubenswrapper[4707]: I0129 03:59:57.034417 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-zntq4"] Jan 29 03:59:57 crc kubenswrapper[4707]: I0129 03:59:57.042009 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-zntq4"] Jan 29 03:59:57 crc kubenswrapper[4707]: I0129 03:59:57.272870 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f02b4a-bf97-46a9-94a1-b60db6b01a33" path="/var/lib/kubelet/pods/90f02b4a-bf97-46a9-94a1-b60db6b01a33/volumes" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.154956 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms"] Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.157214 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.159787 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.160610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.164583 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms"] Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.277932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5954036d-8bd6-4b27-9156-75fdc0744f98-config-volume\") pod \"collect-profiles-29494320-sktms\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.277995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5954036d-8bd6-4b27-9156-75fdc0744f98-secret-volume\") pod \"collect-profiles-29494320-sktms\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.278142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnb4j\" (UniqueName: \"kubernetes.io/projected/5954036d-8bd6-4b27-9156-75fdc0744f98-kube-api-access-pnb4j\") pod \"collect-profiles-29494320-sktms\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.380058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnb4j\" (UniqueName: \"kubernetes.io/projected/5954036d-8bd6-4b27-9156-75fdc0744f98-kube-api-access-pnb4j\") pod \"collect-profiles-29494320-sktms\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.380302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5954036d-8bd6-4b27-9156-75fdc0744f98-config-volume\") pod \"collect-profiles-29494320-sktms\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.380335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5954036d-8bd6-4b27-9156-75fdc0744f98-secret-volume\") pod \"collect-profiles-29494320-sktms\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.381418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5954036d-8bd6-4b27-9156-75fdc0744f98-config-volume\") pod \"collect-profiles-29494320-sktms\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.387002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5954036d-8bd6-4b27-9156-75fdc0744f98-secret-volume\") pod \"collect-profiles-29494320-sktms\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.426246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnb4j\" (UniqueName: \"kubernetes.io/projected/5954036d-8bd6-4b27-9156-75fdc0744f98-kube-api-access-pnb4j\") pod \"collect-profiles-29494320-sktms\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.479801 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:00 crc kubenswrapper[4707]: I0129 04:00:00.929070 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms"] Jan 29 04:00:01 crc kubenswrapper[4707]: I0129 04:00:01.276134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" event={"ID":"5954036d-8bd6-4b27-9156-75fdc0744f98","Type":"ContainerStarted","Data":"68d5a0dd3f54f372e12af3ecd9080a6d2563496a6434ebcef69a8b9a02a5cef6"} Jan 29 04:00:01 crc kubenswrapper[4707]: I0129 04:00:01.277621 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" event={"ID":"5954036d-8bd6-4b27-9156-75fdc0744f98","Type":"ContainerStarted","Data":"0047e147d442b702bd9955b7b0740220286b9ca4bade00103040f3083d892e01"} Jan 29 04:00:01 crc kubenswrapper[4707]: I0129 04:00:01.289603 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" 
podStartSLOduration=1.289587002 podStartE2EDuration="1.289587002s" podCreationTimestamp="2026-01-29 04:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:00:01.285018084 +0000 UTC m=+1954.769246999" watchObservedRunningTime="2026-01-29 04:00:01.289587002 +0000 UTC m=+1954.773815907" Jan 29 04:00:02 crc kubenswrapper[4707]: I0129 04:00:02.275826 4707 generic.go:334] "Generic (PLEG): container finished" podID="5954036d-8bd6-4b27-9156-75fdc0744f98" containerID="68d5a0dd3f54f372e12af3ecd9080a6d2563496a6434ebcef69a8b9a02a5cef6" exitCode=0 Jan 29 04:00:02 crc kubenswrapper[4707]: I0129 04:00:02.275891 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" event={"ID":"5954036d-8bd6-4b27-9156-75fdc0744f98","Type":"ContainerDied","Data":"68d5a0dd3f54f372e12af3ecd9080a6d2563496a6434ebcef69a8b9a02a5cef6"} Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.614126 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.652793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5954036d-8bd6-4b27-9156-75fdc0744f98-secret-volume\") pod \"5954036d-8bd6-4b27-9156-75fdc0744f98\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.652916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5954036d-8bd6-4b27-9156-75fdc0744f98-config-volume\") pod \"5954036d-8bd6-4b27-9156-75fdc0744f98\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.652938 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnb4j\" (UniqueName: \"kubernetes.io/projected/5954036d-8bd6-4b27-9156-75fdc0744f98-kube-api-access-pnb4j\") pod \"5954036d-8bd6-4b27-9156-75fdc0744f98\" (UID: \"5954036d-8bd6-4b27-9156-75fdc0744f98\") " Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.656219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5954036d-8bd6-4b27-9156-75fdc0744f98-config-volume" (OuterVolumeSpecName: "config-volume") pod "5954036d-8bd6-4b27-9156-75fdc0744f98" (UID: "5954036d-8bd6-4b27-9156-75fdc0744f98"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.661863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5954036d-8bd6-4b27-9156-75fdc0744f98-kube-api-access-pnb4j" (OuterVolumeSpecName: "kube-api-access-pnb4j") pod "5954036d-8bd6-4b27-9156-75fdc0744f98" (UID: "5954036d-8bd6-4b27-9156-75fdc0744f98"). 
InnerVolumeSpecName "kube-api-access-pnb4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.662577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5954036d-8bd6-4b27-9156-75fdc0744f98-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5954036d-8bd6-4b27-9156-75fdc0744f98" (UID: "5954036d-8bd6-4b27-9156-75fdc0744f98"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.755750 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5954036d-8bd6-4b27-9156-75fdc0744f98-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.755788 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5954036d-8bd6-4b27-9156-75fdc0744f98-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:03 crc kubenswrapper[4707]: I0129 04:00:03.755799 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnb4j\" (UniqueName: \"kubernetes.io/projected/5954036d-8bd6-4b27-9156-75fdc0744f98-kube-api-access-pnb4j\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:04 crc kubenswrapper[4707]: I0129 04:00:04.303776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" event={"ID":"5954036d-8bd6-4b27-9156-75fdc0744f98","Type":"ContainerDied","Data":"0047e147d442b702bd9955b7b0740220286b9ca4bade00103040f3083d892e01"} Jan 29 04:00:04 crc kubenswrapper[4707]: I0129 04:00:04.303842 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0047e147d442b702bd9955b7b0740220286b9ca4bade00103040f3083d892e01" Jan 29 04:00:04 crc kubenswrapper[4707]: I0129 04:00:04.303907 4707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms" Jan 29 04:00:12 crc kubenswrapper[4707]: I0129 04:00:12.377116 4707 generic.go:334] "Generic (PLEG): container finished" podID="945bd58d-5ea2-4118-a675-3b7b127d9d4c" containerID="32c8174e277301b9996f1dfdcad00602b56aee6507967b76a071e7c23196522d" exitCode=0 Jan 29 04:00:12 crc kubenswrapper[4707]: I0129 04:00:12.377258 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" event={"ID":"945bd58d-5ea2-4118-a675-3b7b127d9d4c","Type":"ContainerDied","Data":"32c8174e277301b9996f1dfdcad00602b56aee6507967b76a071e7c23196522d"} Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.845030 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.874645 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ovn-combined-ca-bundle\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.874930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bqrj\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-kube-api-access-6bqrj\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.875051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-inventory\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: 
\"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.875179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.875300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-neutron-metadata-combined-ca-bundle\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.875390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-nova-combined-ca-bundle\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.875551 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-repo-setup-combined-ca-bundle\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.875675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: 
\"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.875785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-telemetry-combined-ca-bundle\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.875893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.876041 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.876154 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-bootstrap-combined-ca-bundle\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.876284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-libvirt-combined-ca-bundle\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: 
\"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.876402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ssh-key-openstack-edpm-ipam\") pod \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\" (UID: \"945bd58d-5ea2-4118-a675-3b7b127d9d4c\") " Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.880567 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-kube-api-access-6bqrj" (OuterVolumeSpecName: "kube-api-access-6bqrj") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "kube-api-access-6bqrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.883926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.892592 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.895454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.896774 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.898627 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.899850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.899866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.900006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.900488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.911095 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.916572 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.919110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.929343 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-inventory" (OuterVolumeSpecName: "inventory") pod "945bd58d-5ea2-4118-a675-3b7b127d9d4c" (UID: "945bd58d-5ea2-4118-a675-3b7b127d9d4c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.979265 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.979467 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.979619 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bqrj\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-kube-api-access-6bqrj\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.979685 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.979856 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.979958 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.980025 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.980080 4707 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.980167 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.980281 4707 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.980359 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.980440 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/945bd58d-5ea2-4118-a675-3b7b127d9d4c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc kubenswrapper[4707]: I0129 04:00:13.980510 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:00:13 crc 
kubenswrapper[4707]: I0129 04:00:13.980591 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945bd58d-5ea2-4118-a675-3b7b127d9d4c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.398074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p" event={"ID":"945bd58d-5ea2-4118-a675-3b7b127d9d4c","Type":"ContainerDied","Data":"fbce099c151bc9b194002d66694f47c038ae6006723cb8efdd662f6b8d9d1923"}
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.398849 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbce099c151bc9b194002d66694f47c038ae6006723cb8efdd662f6b8d9d1923"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.398819 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.563576 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"]
Jan 29 04:00:14 crc kubenswrapper[4707]: E0129 04:00:14.564150 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5954036d-8bd6-4b27-9156-75fdc0744f98" containerName="collect-profiles"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.564192 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5954036d-8bd6-4b27-9156-75fdc0744f98" containerName="collect-profiles"
Jan 29 04:00:14 crc kubenswrapper[4707]: E0129 04:00:14.564229 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945bd58d-5ea2-4118-a675-3b7b127d9d4c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.564239 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="945bd58d-5ea2-4118-a675-3b7b127d9d4c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.564627 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5954036d-8bd6-4b27-9156-75fdc0744f98" containerName="collect-profiles"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.564656 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="945bd58d-5ea2-4118-a675-3b7b127d9d4c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.565737 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.571426 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.571438 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.571460 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.571553 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.571747 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.579511 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"]
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.694881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf44t\" (UniqueName: \"kubernetes.io/projected/80667caf-0ec4-4178-96b2-93b148db9c1e-kube-api-access-lf44t\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.694932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/80667caf-0ec4-4178-96b2-93b148db9c1e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.695194 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.695570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.695677 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.797827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.798242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.798376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.798566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf44t\" (UniqueName: \"kubernetes.io/projected/80667caf-0ec4-4178-96b2-93b148db9c1e-kube-api-access-lf44t\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.798695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/80667caf-0ec4-4178-96b2-93b148db9c1e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.800520 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/80667caf-0ec4-4178-96b2-93b148db9c1e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.803096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.803698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.804196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.817371 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf44t\" (UniqueName: \"kubernetes.io/projected/80667caf-0ec4-4178-96b2-93b148db9c1e-kube-api-access-lf44t\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bzfvj\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:14 crc kubenswrapper[4707]: I0129 04:00:14.883372 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:00:15 crc kubenswrapper[4707]: I0129 04:00:15.459284 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"]
Jan 29 04:00:16 crc kubenswrapper[4707]: I0129 04:00:16.424984 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj" event={"ID":"80667caf-0ec4-4178-96b2-93b148db9c1e","Type":"ContainerStarted","Data":"37803b2fc31658ed090886933cd0a023279886a3506caaee2d223d2c6b8a92c8"}
Jan 29 04:00:16 crc kubenswrapper[4707]: I0129 04:00:16.425619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj" event={"ID":"80667caf-0ec4-4178-96b2-93b148db9c1e","Type":"ContainerStarted","Data":"f07b98f473ec85523c80d54678774c0c6db11576850de32574bd61b2d99adfda"}
Jan 29 04:00:16 crc kubenswrapper[4707]: I0129 04:00:16.448166 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj" podStartSLOduration=1.832131105 podStartE2EDuration="2.448140638s" podCreationTimestamp="2026-01-29 04:00:14 +0000 UTC" firstStartedPulling="2026-01-29 04:00:15.464380458 +0000 UTC m=+1968.948609373" lastFinishedPulling="2026-01-29 04:00:16.080390001 +0000 UTC m=+1969.564618906" observedRunningTime="2026-01-29 04:00:16.444608529 +0000 UTC m=+1969.928837454" watchObservedRunningTime="2026-01-29 04:00:16.448140638 +0000 UTC m=+1969.932369543"
Jan 29 04:00:33 crc kubenswrapper[4707]: I0129 04:00:33.463415 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 04:00:33 crc kubenswrapper[4707]: I0129 04:00:33.464094 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 04:00:39 crc kubenswrapper[4707]: I0129 04:00:39.687287 4707 scope.go:117] "RemoveContainer" containerID="bcc5990f8297b84dc1ce717f93a19bc0397cd8a458ae463db68bf4c18fea9903"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.152317 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29494321-prr9g"]
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.154484 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.161223 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29494321-prr9g"]
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.264424 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-combined-ca-bundle\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.264477 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-fernet-keys\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.264608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-config-data\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.264697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7rbw\" (UniqueName: \"kubernetes.io/projected/b1afb9c0-b9e9-46d1-b608-36148c671d74-kube-api-access-s7rbw\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.366766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7rbw\" (UniqueName: \"kubernetes.io/projected/b1afb9c0-b9e9-46d1-b608-36148c671d74-kube-api-access-s7rbw\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.366845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-combined-ca-bundle\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.366877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-fernet-keys\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.366988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-config-data\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.372755 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-fernet-keys\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.373240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-combined-ca-bundle\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.383478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-config-data\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.383689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7rbw\" (UniqueName: \"kubernetes.io/projected/b1afb9c0-b9e9-46d1-b608-36148c671d74-kube-api-access-s7rbw\") pod \"keystone-cron-29494321-prr9g\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") " pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:00 crc kubenswrapper[4707]: I0129 04:01:00.482888 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:01 crc kubenswrapper[4707]: I0129 04:01:01.113121 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29494321-prr9g"]
Jan 29 04:01:01 crc kubenswrapper[4707]: I0129 04:01:01.863455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29494321-prr9g" event={"ID":"b1afb9c0-b9e9-46d1-b608-36148c671d74","Type":"ContainerStarted","Data":"01205deafdb721bd6d891def0c80b92cc2f35c0a4bac3cbf591cc1c3992c0f90"}
Jan 29 04:01:01 crc kubenswrapper[4707]: I0129 04:01:01.863848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29494321-prr9g" event={"ID":"b1afb9c0-b9e9-46d1-b608-36148c671d74","Type":"ContainerStarted","Data":"b39879c6d2742b5d86aa91b3c649ed0429598cab49392b71bff1d8d081ddfa4b"}
Jan 29 04:01:01 crc kubenswrapper[4707]: I0129 04:01:01.891366 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29494321-prr9g" podStartSLOduration=1.8913430949999999 podStartE2EDuration="1.891343095s" podCreationTimestamp="2026-01-29 04:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:01:01.879515892 +0000 UTC m=+2015.363744797" watchObservedRunningTime="2026-01-29 04:01:01.891343095 +0000 UTC m=+2015.375572000"
Jan 29 04:01:03 crc kubenswrapper[4707]: I0129 04:01:03.463451 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 04:01:03 crc kubenswrapper[4707]: I0129 04:01:03.464457 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 04:01:03 crc kubenswrapper[4707]: I0129 04:01:03.887824 4707 generic.go:334] "Generic (PLEG): container finished" podID="b1afb9c0-b9e9-46d1-b608-36148c671d74" containerID="01205deafdb721bd6d891def0c80b92cc2f35c0a4bac3cbf591cc1c3992c0f90" exitCode=0
Jan 29 04:01:03 crc kubenswrapper[4707]: I0129 04:01:03.887879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29494321-prr9g" event={"ID":"b1afb9c0-b9e9-46d1-b608-36148c671d74","Type":"ContainerDied","Data":"01205deafdb721bd6d891def0c80b92cc2f35c0a4bac3cbf591cc1c3992c0f90"}
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.231009 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.276295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-combined-ca-bundle\") pod \"b1afb9c0-b9e9-46d1-b608-36148c671d74\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") "
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.276379 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-fernet-keys\") pod \"b1afb9c0-b9e9-46d1-b608-36148c671d74\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") "
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.276408 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-config-data\") pod \"b1afb9c0-b9e9-46d1-b608-36148c671d74\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") "
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.276761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7rbw\" (UniqueName: \"kubernetes.io/projected/b1afb9c0-b9e9-46d1-b608-36148c671d74-kube-api-access-s7rbw\") pod \"b1afb9c0-b9e9-46d1-b608-36148c671d74\" (UID: \"b1afb9c0-b9e9-46d1-b608-36148c671d74\") "
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.282371 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b1afb9c0-b9e9-46d1-b608-36148c671d74" (UID: "b1afb9c0-b9e9-46d1-b608-36148c671d74"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.284835 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1afb9c0-b9e9-46d1-b608-36148c671d74-kube-api-access-s7rbw" (OuterVolumeSpecName: "kube-api-access-s7rbw") pod "b1afb9c0-b9e9-46d1-b608-36148c671d74" (UID: "b1afb9c0-b9e9-46d1-b608-36148c671d74"). InnerVolumeSpecName "kube-api-access-s7rbw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.305449 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1afb9c0-b9e9-46d1-b608-36148c671d74" (UID: "b1afb9c0-b9e9-46d1-b608-36148c671d74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.334213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-config-data" (OuterVolumeSpecName: "config-data") pod "b1afb9c0-b9e9-46d1-b608-36148c671d74" (UID: "b1afb9c0-b9e9-46d1-b608-36148c671d74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.378984 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.379024 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.379039 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1afb9c0-b9e9-46d1-b608-36148c671d74-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.379053 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7rbw\" (UniqueName: \"kubernetes.io/projected/b1afb9c0-b9e9-46d1-b608-36148c671d74-kube-api-access-s7rbw\") on node \"crc\" DevicePath \"\""
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.904845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29494321-prr9g" event={"ID":"b1afb9c0-b9e9-46d1-b608-36148c671d74","Type":"ContainerDied","Data":"b39879c6d2742b5d86aa91b3c649ed0429598cab49392b71bff1d8d081ddfa4b"}
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.905247 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39879c6d2742b5d86aa91b3c649ed0429598cab49392b71bff1d8d081ddfa4b"
Jan 29 04:01:05 crc kubenswrapper[4707]: I0129 04:01:05.904892 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29494321-prr9g"
Jan 29 04:01:12 crc kubenswrapper[4707]: I0129 04:01:12.964675 4707 generic.go:334] "Generic (PLEG): container finished" podID="80667caf-0ec4-4178-96b2-93b148db9c1e" containerID="37803b2fc31658ed090886933cd0a023279886a3506caaee2d223d2c6b8a92c8" exitCode=0
Jan 29 04:01:12 crc kubenswrapper[4707]: I0129 04:01:12.964886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj" event={"ID":"80667caf-0ec4-4178-96b2-93b148db9c1e","Type":"ContainerDied","Data":"37803b2fc31658ed090886933cd0a023279886a3506caaee2d223d2c6b8a92c8"}
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.373735 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.554834 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/80667caf-0ec4-4178-96b2-93b148db9c1e-ovncontroller-config-0\") pod \"80667caf-0ec4-4178-96b2-93b148db9c1e\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") "
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.554887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ovn-combined-ca-bundle\") pod \"80667caf-0ec4-4178-96b2-93b148db9c1e\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") "
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.554912 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf44t\" (UniqueName: \"kubernetes.io/projected/80667caf-0ec4-4178-96b2-93b148db9c1e-kube-api-access-lf44t\") pod \"80667caf-0ec4-4178-96b2-93b148db9c1e\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") "
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.554954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-inventory\") pod \"80667caf-0ec4-4178-96b2-93b148db9c1e\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") "
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.555118 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ssh-key-openstack-edpm-ipam\") pod \"80667caf-0ec4-4178-96b2-93b148db9c1e\" (UID: \"80667caf-0ec4-4178-96b2-93b148db9c1e\") "
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.560767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "80667caf-0ec4-4178-96b2-93b148db9c1e" (UID: "80667caf-0ec4-4178-96b2-93b148db9c1e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.563002 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80667caf-0ec4-4178-96b2-93b148db9c1e-kube-api-access-lf44t" (OuterVolumeSpecName: "kube-api-access-lf44t") pod "80667caf-0ec4-4178-96b2-93b148db9c1e" (UID: "80667caf-0ec4-4178-96b2-93b148db9c1e"). InnerVolumeSpecName "kube-api-access-lf44t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.580740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80667caf-0ec4-4178-96b2-93b148db9c1e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "80667caf-0ec4-4178-96b2-93b148db9c1e" (UID: "80667caf-0ec4-4178-96b2-93b148db9c1e"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.585762 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-inventory" (OuterVolumeSpecName: "inventory") pod "80667caf-0ec4-4178-96b2-93b148db9c1e" (UID: "80667caf-0ec4-4178-96b2-93b148db9c1e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.606450 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "80667caf-0ec4-4178-96b2-93b148db9c1e" (UID: "80667caf-0ec4-4178-96b2-93b148db9c1e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.657779 4707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/80667caf-0ec4-4178-96b2-93b148db9c1e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.657821 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.657831 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf44t\" (UniqueName: \"kubernetes.io/projected/80667caf-0ec4-4178-96b2-93b148db9c1e-kube-api-access-lf44t\") on node \"crc\" DevicePath \"\""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.657842 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-inventory\") on node \"crc\" DevicePath \"\""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.657852 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80667caf-0ec4-4178-96b2-93b148db9c1e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.984322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj" event={"ID":"80667caf-0ec4-4178-96b2-93b148db9c1e","Type":"ContainerDied","Data":"f07b98f473ec85523c80d54678774c0c6db11576850de32574bd61b2d99adfda"}
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.984367 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f07b98f473ec85523c80d54678774c0c6db11576850de32574bd61b2d99adfda"
Jan 29 04:01:14 crc kubenswrapper[4707]: I0129 04:01:14.984697 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bzfvj"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.073779 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"]
Jan 29 04:01:15 crc kubenswrapper[4707]: E0129 04:01:15.074279 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80667caf-0ec4-4178-96b2-93b148db9c1e" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.074303 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="80667caf-0ec4-4178-96b2-93b148db9c1e" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Jan 29 04:01:15 crc kubenswrapper[4707]: E0129 04:01:15.074328 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1afb9c0-b9e9-46d1-b608-36148c671d74" containerName="keystone-cron"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.074338 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1afb9c0-b9e9-46d1-b608-36148c671d74" containerName="keystone-cron"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.074608 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="80667caf-0ec4-4178-96b2-93b148db9c1e" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.074647 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1afb9c0-b9e9-46d1-b608-36148c671d74" containerName="keystone-cron"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.075792 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.078364 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.078439 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.078702 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.078712 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.078888 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.079326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.091657 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"]
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.269689 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.269757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.269773 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b526m\" (UniqueName: \"kubernetes.io/projected/9f02add7-c3ef-4952-b83f-1799bf08bad0-kube-api-access-b526m\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.269817 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.269835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.270555 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.372885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.372977 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.373009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"
Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.373028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b526m\" (UniqueName:
\"kubernetes.io/projected/9f02add7-c3ef-4952-b83f-1799bf08bad0-kube-api-access-b526m\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.373054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.373084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.377282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.377778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.378042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.378218 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.378860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.406791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b526m\" (UniqueName: \"kubernetes.io/projected/9f02add7-c3ef-4952-b83f-1799bf08bad0-kube-api-access-b526m\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n\" (UID: 
\"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:15 crc kubenswrapper[4707]: I0129 04:01:15.701310 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:01:16 crc kubenswrapper[4707]: I0129 04:01:16.236495 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n"] Jan 29 04:01:17 crc kubenswrapper[4707]: I0129 04:01:17.024005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" event={"ID":"9f02add7-c3ef-4952-b83f-1799bf08bad0","Type":"ContainerStarted","Data":"533d6d3a0284b6036e219feed6e161174cbec242b748cead1c65a2ec69e10cd5"} Jan 29 04:01:17 crc kubenswrapper[4707]: I0129 04:01:17.024268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" event={"ID":"9f02add7-c3ef-4952-b83f-1799bf08bad0","Type":"ContainerStarted","Data":"d64c9a336f15473a2cc86984c68086680b5d84f8cc46fbe2c5df282622e86fcb"} Jan 29 04:01:17 crc kubenswrapper[4707]: I0129 04:01:17.041499 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" podStartSLOduration=1.5758260929999999 podStartE2EDuration="2.041481259s" podCreationTimestamp="2026-01-29 04:01:15 +0000 UTC" firstStartedPulling="2026-01-29 04:01:16.246826304 +0000 UTC m=+2029.731055209" lastFinishedPulling="2026-01-29 04:01:16.71248146 +0000 UTC m=+2030.196710375" observedRunningTime="2026-01-29 04:01:17.037372393 +0000 UTC m=+2030.521601298" watchObservedRunningTime="2026-01-29 04:01:17.041481259 +0000 UTC m=+2030.525710164" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.634655 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-kzh2x"] Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.637909 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.666004 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kzh2x"] Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.780429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-utilities\") pod \"redhat-operators-kzh2x\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.780503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkz9f\" (UniqueName: \"kubernetes.io/projected/fcca404e-23d7-47d5-9598-d6f07e65759a-kube-api-access-tkz9f\") pod \"redhat-operators-kzh2x\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.780645 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-catalog-content\") pod \"redhat-operators-kzh2x\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.882764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-catalog-content\") pod \"redhat-operators-kzh2x\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " 
pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.882878 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-utilities\") pod \"redhat-operators-kzh2x\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.882925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkz9f\" (UniqueName: \"kubernetes.io/projected/fcca404e-23d7-47d5-9598-d6f07e65759a-kube-api-access-tkz9f\") pod \"redhat-operators-kzh2x\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.883597 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-catalog-content\") pod \"redhat-operators-kzh2x\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.883959 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-utilities\") pod \"redhat-operators-kzh2x\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:24 crc kubenswrapper[4707]: I0129 04:01:24.904325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkz9f\" (UniqueName: \"kubernetes.io/projected/fcca404e-23d7-47d5-9598-d6f07e65759a-kube-api-access-tkz9f\") pod \"redhat-operators-kzh2x\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:25 
crc kubenswrapper[4707]: I0129 04:01:25.027260 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:25 crc kubenswrapper[4707]: I0129 04:01:25.514352 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kzh2x"] Jan 29 04:01:26 crc kubenswrapper[4707]: I0129 04:01:26.122689 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerID="bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311" exitCode=0 Jan 29 04:01:26 crc kubenswrapper[4707]: I0129 04:01:26.122747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzh2x" event={"ID":"fcca404e-23d7-47d5-9598-d6f07e65759a","Type":"ContainerDied","Data":"bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311"} Jan 29 04:01:26 crc kubenswrapper[4707]: I0129 04:01:26.122778 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzh2x" event={"ID":"fcca404e-23d7-47d5-9598-d6f07e65759a","Type":"ContainerStarted","Data":"9cbac5f75f994324b41dc36a1f6461e69786d7ce8cbf3734668a456ac7a1e615"} Jan 29 04:01:27 crc kubenswrapper[4707]: I0129 04:01:27.132092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzh2x" event={"ID":"fcca404e-23d7-47d5-9598-d6f07e65759a","Type":"ContainerStarted","Data":"a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050"} Jan 29 04:01:30 crc kubenswrapper[4707]: I0129 04:01:30.165180 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerID="a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050" exitCode=0 Jan 29 04:01:30 crc kubenswrapper[4707]: I0129 04:01:30.165268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzh2x" 
event={"ID":"fcca404e-23d7-47d5-9598-d6f07e65759a","Type":"ContainerDied","Data":"a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050"} Jan 29 04:01:31 crc kubenswrapper[4707]: I0129 04:01:31.179284 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzh2x" event={"ID":"fcca404e-23d7-47d5-9598-d6f07e65759a","Type":"ContainerStarted","Data":"c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d"} Jan 29 04:01:31 crc kubenswrapper[4707]: I0129 04:01:31.215355 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kzh2x" podStartSLOduration=2.767105186 podStartE2EDuration="7.215329288s" podCreationTimestamp="2026-01-29 04:01:24 +0000 UTC" firstStartedPulling="2026-01-29 04:01:26.125748179 +0000 UTC m=+2039.609977084" lastFinishedPulling="2026-01-29 04:01:30.573972271 +0000 UTC m=+2044.058201186" observedRunningTime="2026-01-29 04:01:31.204997226 +0000 UTC m=+2044.689226161" watchObservedRunningTime="2026-01-29 04:01:31.215329288 +0000 UTC m=+2044.699558203" Jan 29 04:01:33 crc kubenswrapper[4707]: I0129 04:01:33.463842 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:01:33 crc kubenswrapper[4707]: I0129 04:01:33.464421 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:01:33 crc kubenswrapper[4707]: I0129 04:01:33.464517 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 04:01:33 crc kubenswrapper[4707]: I0129 04:01:33.465853 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9075a066370335e7cc1a1ac646b318a6399393468866cff65b48a27e1e3b8714"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 04:01:33 crc kubenswrapper[4707]: I0129 04:01:33.465972 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://9075a066370335e7cc1a1ac646b318a6399393468866cff65b48a27e1e3b8714" gracePeriod=600 Jan 29 04:01:34 crc kubenswrapper[4707]: I0129 04:01:34.232590 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="9075a066370335e7cc1a1ac646b318a6399393468866cff65b48a27e1e3b8714" exitCode=0 Jan 29 04:01:34 crc kubenswrapper[4707]: I0129 04:01:34.232649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"9075a066370335e7cc1a1ac646b318a6399393468866cff65b48a27e1e3b8714"} Jan 29 04:01:34 crc kubenswrapper[4707]: I0129 04:01:34.232970 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024"} Jan 29 04:01:34 crc kubenswrapper[4707]: I0129 04:01:34.233004 4707 scope.go:117] "RemoveContainer" 
containerID="f3a708dddfc2da9998824e1b9ef1ccc5ed9625036f5480f74cee959623be7204" Jan 29 04:01:35 crc kubenswrapper[4707]: I0129 04:01:35.028370 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:35 crc kubenswrapper[4707]: I0129 04:01:35.028778 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:36 crc kubenswrapper[4707]: I0129 04:01:36.089322 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kzh2x" podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerName="registry-server" probeResult="failure" output=< Jan 29 04:01:36 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 29 04:01:36 crc kubenswrapper[4707]: > Jan 29 04:01:45 crc kubenswrapper[4707]: I0129 04:01:45.101856 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:45 crc kubenswrapper[4707]: I0129 04:01:45.182980 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:45 crc kubenswrapper[4707]: I0129 04:01:45.356864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kzh2x"] Jan 29 04:01:46 crc kubenswrapper[4707]: I0129 04:01:46.369612 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kzh2x" podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerName="registry-server" containerID="cri-o://c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d" gracePeriod=2 Jan 29 04:01:46 crc kubenswrapper[4707]: I0129 04:01:46.821990 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:46 crc kubenswrapper[4707]: I0129 04:01:46.918375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-catalog-content\") pod \"fcca404e-23d7-47d5-9598-d6f07e65759a\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " Jan 29 04:01:46 crc kubenswrapper[4707]: I0129 04:01:46.918480 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkz9f\" (UniqueName: \"kubernetes.io/projected/fcca404e-23d7-47d5-9598-d6f07e65759a-kube-api-access-tkz9f\") pod \"fcca404e-23d7-47d5-9598-d6f07e65759a\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " Jan 29 04:01:46 crc kubenswrapper[4707]: I0129 04:01:46.918577 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-utilities\") pod \"fcca404e-23d7-47d5-9598-d6f07e65759a\" (UID: \"fcca404e-23d7-47d5-9598-d6f07e65759a\") " Jan 29 04:01:46 crc kubenswrapper[4707]: I0129 04:01:46.919628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-utilities" (OuterVolumeSpecName: "utilities") pod "fcca404e-23d7-47d5-9598-d6f07e65759a" (UID: "fcca404e-23d7-47d5-9598-d6f07e65759a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:01:46 crc kubenswrapper[4707]: I0129 04:01:46.920858 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:01:46 crc kubenswrapper[4707]: I0129 04:01:46.924514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcca404e-23d7-47d5-9598-d6f07e65759a-kube-api-access-tkz9f" (OuterVolumeSpecName: "kube-api-access-tkz9f") pod "fcca404e-23d7-47d5-9598-d6f07e65759a" (UID: "fcca404e-23d7-47d5-9598-d6f07e65759a"). InnerVolumeSpecName "kube-api-access-tkz9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.023152 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkz9f\" (UniqueName: \"kubernetes.io/projected/fcca404e-23d7-47d5-9598-d6f07e65759a-kube-api-access-tkz9f\") on node \"crc\" DevicePath \"\"" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.048760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcca404e-23d7-47d5-9598-d6f07e65759a" (UID: "fcca404e-23d7-47d5-9598-d6f07e65759a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.125615 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcca404e-23d7-47d5-9598-d6f07e65759a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.380986 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerID="c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d" exitCode=0 Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.381029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzh2x" event={"ID":"fcca404e-23d7-47d5-9598-d6f07e65759a","Type":"ContainerDied","Data":"c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d"} Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.381059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzh2x" event={"ID":"fcca404e-23d7-47d5-9598-d6f07e65759a","Type":"ContainerDied","Data":"9cbac5f75f994324b41dc36a1f6461e69786d7ce8cbf3734668a456ac7a1e615"} Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.381080 4707 scope.go:117] "RemoveContainer" containerID="c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.381091 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kzh2x" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.410422 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kzh2x"] Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.411360 4707 scope.go:117] "RemoveContainer" containerID="a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.419796 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kzh2x"] Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.437777 4707 scope.go:117] "RemoveContainer" containerID="bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.482005 4707 scope.go:117] "RemoveContainer" containerID="c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d" Jan 29 04:01:47 crc kubenswrapper[4707]: E0129 04:01:47.482475 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d\": container with ID starting with c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d not found: ID does not exist" containerID="c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.482519 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d"} err="failed to get container status \"c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d\": rpc error: code = NotFound desc = could not find container \"c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d\": container with ID starting with c68b312ff9a7f1241b8c0bb9691fda9b74c2f830452370264acaa2ec58d8e04d not found: ID does 
not exist" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.482568 4707 scope.go:117] "RemoveContainer" containerID="a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050" Jan 29 04:01:47 crc kubenswrapper[4707]: E0129 04:01:47.482886 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050\": container with ID starting with a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050 not found: ID does not exist" containerID="a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.482912 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050"} err="failed to get container status \"a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050\": rpc error: code = NotFound desc = could not find container \"a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050\": container with ID starting with a003f03c043c870a3028abd9344d311465a7d7a281dc82e9b5fb7757d93fb050 not found: ID does not exist" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.482926 4707 scope.go:117] "RemoveContainer" containerID="bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311" Jan 29 04:01:47 crc kubenswrapper[4707]: E0129 04:01:47.483200 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311\": container with ID starting with bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311 not found: ID does not exist" containerID="bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311" Jan 29 04:01:47 crc kubenswrapper[4707]: I0129 04:01:47.483227 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311"} err="failed to get container status \"bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311\": rpc error: code = NotFound desc = could not find container \"bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311\": container with ID starting with bcb78c7e22fcd7278e976cd57ba028079aeacf7bda7f365169521aa7d3e00311 not found: ID does not exist" Jan 29 04:01:49 crc kubenswrapper[4707]: I0129 04:01:49.256407 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" path="/var/lib/kubelet/pods/fcca404e-23d7-47d5-9598-d6f07e65759a/volumes" Jan 29 04:02:03 crc kubenswrapper[4707]: I0129 04:02:03.567760 4707 generic.go:334] "Generic (PLEG): container finished" podID="9f02add7-c3ef-4952-b83f-1799bf08bad0" containerID="533d6d3a0284b6036e219feed6e161174cbec242b748cead1c65a2ec69e10cd5" exitCode=0 Jan 29 04:02:03 crc kubenswrapper[4707]: I0129 04:02:03.567847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" event={"ID":"9f02add7-c3ef-4952-b83f-1799bf08bad0","Type":"ContainerDied","Data":"533d6d3a0284b6036e219feed6e161174cbec242b748cead1c65a2ec69e10cd5"} Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.046909 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.148520 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-ssh-key-openstack-edpm-ipam\") pod \"9f02add7-c3ef-4952-b83f-1799bf08bad0\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.149003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9f02add7-c3ef-4952-b83f-1799bf08bad0\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.149170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-nova-metadata-neutron-config-0\") pod \"9f02add7-c3ef-4952-b83f-1799bf08bad0\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.149246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-inventory\") pod \"9f02add7-c3ef-4952-b83f-1799bf08bad0\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.149307 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b526m\" (UniqueName: \"kubernetes.io/projected/9f02add7-c3ef-4952-b83f-1799bf08bad0-kube-api-access-b526m\") pod \"9f02add7-c3ef-4952-b83f-1799bf08bad0\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " Jan 29 04:02:05 crc 
kubenswrapper[4707]: I0129 04:02:05.149334 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-metadata-combined-ca-bundle\") pod \"9f02add7-c3ef-4952-b83f-1799bf08bad0\" (UID: \"9f02add7-c3ef-4952-b83f-1799bf08bad0\") " Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.154825 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9f02add7-c3ef-4952-b83f-1799bf08bad0" (UID: "9f02add7-c3ef-4952-b83f-1799bf08bad0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.155084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f02add7-c3ef-4952-b83f-1799bf08bad0-kube-api-access-b526m" (OuterVolumeSpecName: "kube-api-access-b526m") pod "9f02add7-c3ef-4952-b83f-1799bf08bad0" (UID: "9f02add7-c3ef-4952-b83f-1799bf08bad0"). InnerVolumeSpecName "kube-api-access-b526m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.178096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9f02add7-c3ef-4952-b83f-1799bf08bad0" (UID: "9f02add7-c3ef-4952-b83f-1799bf08bad0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.178573 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-inventory" (OuterVolumeSpecName: "inventory") pod "9f02add7-c3ef-4952-b83f-1799bf08bad0" (UID: "9f02add7-c3ef-4952-b83f-1799bf08bad0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.181317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9f02add7-c3ef-4952-b83f-1799bf08bad0" (UID: "9f02add7-c3ef-4952-b83f-1799bf08bad0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.183862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9f02add7-c3ef-4952-b83f-1799bf08bad0" (UID: "9f02add7-c3ef-4952-b83f-1799bf08bad0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.251243 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.251273 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.251286 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.251295 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b526m\" (UniqueName: \"kubernetes.io/projected/9f02add7-c3ef-4952-b83f-1799bf08bad0-kube-api-access-b526m\") on node \"crc\" DevicePath \"\"" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.251308 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.251318 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f02add7-c3ef-4952-b83f-1799bf08bad0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.596814 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" event={"ID":"9f02add7-c3ef-4952-b83f-1799bf08bad0","Type":"ContainerDied","Data":"d64c9a336f15473a2cc86984c68086680b5d84f8cc46fbe2c5df282622e86fcb"} Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.597235 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d64c9a336f15473a2cc86984c68086680b5d84f8cc46fbe2c5df282622e86fcb" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.597441 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.769037 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz"] Jan 29 04:02:05 crc kubenswrapper[4707]: E0129 04:02:05.769728 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerName="extract-utilities" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.769816 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerName="extract-utilities" Jan 29 04:02:05 crc kubenswrapper[4707]: E0129 04:02:05.769903 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerName="extract-content" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.769987 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerName="extract-content" Jan 29 04:02:05 crc kubenswrapper[4707]: E0129 04:02:05.770078 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerName="registry-server" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.770139 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerName="registry-server" Jan 29 04:02:05 crc kubenswrapper[4707]: E0129 04:02:05.770218 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f02add7-c3ef-4952-b83f-1799bf08bad0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.770276 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f02add7-c3ef-4952-b83f-1799bf08bad0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.770523 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcca404e-23d7-47d5-9598-d6f07e65759a" containerName="registry-server" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.770647 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f02add7-c3ef-4952-b83f-1799bf08bad0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.771369 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.773766 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.773797 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.774095 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.774284 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.775476 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.781390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz"] Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.865973 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x88m\" (UniqueName: \"kubernetes.io/projected/a019e4eb-4ee9-4426-bde0-9c6b0319283f-kube-api-access-6x88m\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.866032 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: 
\"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.866067 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.866108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.866189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.968950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x88m\" (UniqueName: \"kubernetes.io/projected/a019e4eb-4ee9-4426-bde0-9c6b0319283f-kube-api-access-6x88m\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.969267 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.969416 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.969588 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.969773 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.975324 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: 
\"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.981893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.981967 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.982674 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:05 crc kubenswrapper[4707]: I0129 04:02:05.984459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x88m\" (UniqueName: \"kubernetes.io/projected/a019e4eb-4ee9-4426-bde0-9c6b0319283f-kube-api-access-6x88m\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:06 crc kubenswrapper[4707]: I0129 04:02:06.108218 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:02:06 crc kubenswrapper[4707]: I0129 04:02:06.606015 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz"] Jan 29 04:02:07 crc kubenswrapper[4707]: I0129 04:02:07.616670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" event={"ID":"a019e4eb-4ee9-4426-bde0-9c6b0319283f","Type":"ContainerStarted","Data":"70c019a3a00458f002451371ecd9887bb1fb0a7b9f791a4ccea1b02e7dff3290"} Jan 29 04:02:07 crc kubenswrapper[4707]: I0129 04:02:07.617053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" event={"ID":"a019e4eb-4ee9-4426-bde0-9c6b0319283f","Type":"ContainerStarted","Data":"85bf56ed43799bde0762a30a31ec20b656e099cbdb8644e7b4512e0d846cff66"} Jan 29 04:02:07 crc kubenswrapper[4707]: I0129 04:02:07.640839 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" podStartSLOduration=2.177909592 podStartE2EDuration="2.64081286s" podCreationTimestamp="2026-01-29 04:02:05 +0000 UTC" firstStartedPulling="2026-01-29 04:02:06.607813081 +0000 UTC m=+2080.092042006" lastFinishedPulling="2026-01-29 04:02:07.070716369 +0000 UTC m=+2080.554945274" observedRunningTime="2026-01-29 04:02:07.631123166 +0000 UTC m=+2081.115352091" watchObservedRunningTime="2026-01-29 04:02:07.64081286 +0000 UTC m=+2081.125041775" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.405209 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gvjq7"] Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.409321 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.429042 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gvjq7"] Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.561982 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-catalog-content\") pod \"community-operators-gvjq7\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.562191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-utilities\") pod \"community-operators-gvjq7\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.562309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n5v6\" (UniqueName: \"kubernetes.io/projected/e2033951-d2c0-42c7-a145-8ebf4e150b3a-kube-api-access-2n5v6\") pod \"community-operators-gvjq7\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.664059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n5v6\" (UniqueName: \"kubernetes.io/projected/e2033951-d2c0-42c7-a145-8ebf4e150b3a-kube-api-access-2n5v6\") pod \"community-operators-gvjq7\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.664118 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-catalog-content\") pod \"community-operators-gvjq7\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.664214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-utilities\") pod \"community-operators-gvjq7\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.664708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-catalog-content\") pod \"community-operators-gvjq7\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.664770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-utilities\") pod \"community-operators-gvjq7\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.688569 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n5v6\" (UniqueName: \"kubernetes.io/projected/e2033951-d2c0-42c7-a145-8ebf4e150b3a-kube-api-access-2n5v6\") pod \"community-operators-gvjq7\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:28 crc kubenswrapper[4707]: I0129 04:02:28.736043 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:29 crc kubenswrapper[4707]: I0129 04:02:29.270173 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gvjq7"] Jan 29 04:02:29 crc kubenswrapper[4707]: W0129 04:02:29.270674 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2033951_d2c0_42c7_a145_8ebf4e150b3a.slice/crio-a4f351cc13f5cb1cb4fac80807ee9936f942be262098a2cd1383e478c9963cc8 WatchSource:0}: Error finding container a4f351cc13f5cb1cb4fac80807ee9936f942be262098a2cd1383e478c9963cc8: Status 404 returned error can't find the container with id a4f351cc13f5cb1cb4fac80807ee9936f942be262098a2cd1383e478c9963cc8 Jan 29 04:02:29 crc kubenswrapper[4707]: I0129 04:02:29.842429 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerID="02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4" exitCode=0 Jan 29 04:02:29 crc kubenswrapper[4707]: I0129 04:02:29.842480 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvjq7" event={"ID":"e2033951-d2c0-42c7-a145-8ebf4e150b3a","Type":"ContainerDied","Data":"02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4"} Jan 29 04:02:29 crc kubenswrapper[4707]: I0129 04:02:29.842513 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvjq7" event={"ID":"e2033951-d2c0-42c7-a145-8ebf4e150b3a","Type":"ContainerStarted","Data":"a4f351cc13f5cb1cb4fac80807ee9936f942be262098a2cd1383e478c9963cc8"} Jan 29 04:02:30 crc kubenswrapper[4707]: I0129 04:02:30.853832 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvjq7" 
event={"ID":"e2033951-d2c0-42c7-a145-8ebf4e150b3a","Type":"ContainerStarted","Data":"9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55"} Jan 29 04:02:31 crc kubenswrapper[4707]: I0129 04:02:31.866432 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerID="9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55" exitCode=0 Jan 29 04:02:31 crc kubenswrapper[4707]: I0129 04:02:31.866586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvjq7" event={"ID":"e2033951-d2c0-42c7-a145-8ebf4e150b3a","Type":"ContainerDied","Data":"9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55"} Jan 29 04:02:32 crc kubenswrapper[4707]: I0129 04:02:32.878481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvjq7" event={"ID":"e2033951-d2c0-42c7-a145-8ebf4e150b3a","Type":"ContainerStarted","Data":"f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459"} Jan 29 04:02:32 crc kubenswrapper[4707]: I0129 04:02:32.904135 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gvjq7" podStartSLOduration=2.477806835 podStartE2EDuration="4.904115697s" podCreationTimestamp="2026-01-29 04:02:28 +0000 UTC" firstStartedPulling="2026-01-29 04:02:29.844739042 +0000 UTC m=+2103.328967947" lastFinishedPulling="2026-01-29 04:02:32.271047904 +0000 UTC m=+2105.755276809" observedRunningTime="2026-01-29 04:02:32.898552939 +0000 UTC m=+2106.382781854" watchObservedRunningTime="2026-01-29 04:02:32.904115697 +0000 UTC m=+2106.388344602" Jan 29 04:02:38 crc kubenswrapper[4707]: I0129 04:02:38.736983 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:38 crc kubenswrapper[4707]: I0129 04:02:38.737354 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:38 crc kubenswrapper[4707]: I0129 04:02:38.795753 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:38 crc kubenswrapper[4707]: I0129 04:02:38.986437 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:39 crc kubenswrapper[4707]: I0129 04:02:39.039524 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gvjq7"] Jan 29 04:02:40 crc kubenswrapper[4707]: I0129 04:02:40.947935 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gvjq7" podUID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerName="registry-server" containerID="cri-o://f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459" gracePeriod=2 Jan 29 04:02:41 crc kubenswrapper[4707]: I0129 04:02:41.888865 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:41 crc kubenswrapper[4707]: I0129 04:02:41.959001 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerID="f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459" exitCode=0 Jan 29 04:02:41 crc kubenswrapper[4707]: I0129 04:02:41.959051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvjq7" event={"ID":"e2033951-d2c0-42c7-a145-8ebf4e150b3a","Type":"ContainerDied","Data":"f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459"} Jan 29 04:02:41 crc kubenswrapper[4707]: I0129 04:02:41.959090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvjq7" event={"ID":"e2033951-d2c0-42c7-a145-8ebf4e150b3a","Type":"ContainerDied","Data":"a4f351cc13f5cb1cb4fac80807ee9936f942be262098a2cd1383e478c9963cc8"} Jan 29 04:02:41 crc kubenswrapper[4707]: I0129 04:02:41.959109 4707 scope.go:117] "RemoveContainer" containerID="f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459" Jan 29 04:02:41 crc kubenswrapper[4707]: I0129 04:02:41.959114 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gvjq7" Jan 29 04:02:41 crc kubenswrapper[4707]: I0129 04:02:41.976947 4707 scope.go:117] "RemoveContainer" containerID="9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55" Jan 29 04:02:41 crc kubenswrapper[4707]: I0129 04:02:41.998306 4707 scope.go:117] "RemoveContainer" containerID="02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.046168 4707 scope.go:117] "RemoveContainer" containerID="f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459" Jan 29 04:02:42 crc kubenswrapper[4707]: E0129 04:02:42.046636 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459\": container with ID starting with f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459 not found: ID does not exist" containerID="f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.046673 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459"} err="failed to get container status \"f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459\": rpc error: code = NotFound desc = could not find container \"f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459\": container with ID starting with f5f37c35c6a4d0ff236e26ae5a7b0a5b277490b5205a17ae01726a9227134459 not found: ID does not exist" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.046699 4707 scope.go:117] "RemoveContainer" containerID="9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55" Jan 29 04:02:42 crc kubenswrapper[4707]: E0129 04:02:42.046985 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55\": container with ID starting with 9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55 not found: ID does not exist" containerID="9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.047015 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55"} err="failed to get container status \"9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55\": rpc error: code = NotFound desc = could not find container \"9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55\": container with ID starting with 9c4b56a42e2707c04267abf071e859aee88bdca3e3a0bd360da50ef30e78dc55 not found: ID does not exist" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.047033 4707 scope.go:117] "RemoveContainer" containerID="02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4" Jan 29 04:02:42 crc kubenswrapper[4707]: E0129 04:02:42.047380 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4\": container with ID starting with 02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4 not found: ID does not exist" containerID="02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.047405 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4"} err="failed to get container status \"02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4\": rpc error: code = NotFound desc = could not find container 
\"02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4\": container with ID starting with 02d6e36030c5c4b86a71579c4cf182321d6902ffc7abd1f6f12678339eca0af4 not found: ID does not exist" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.083986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n5v6\" (UniqueName: \"kubernetes.io/projected/e2033951-d2c0-42c7-a145-8ebf4e150b3a-kube-api-access-2n5v6\") pod \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.084054 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-catalog-content\") pod \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.084145 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-utilities\") pod \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\" (UID: \"e2033951-d2c0-42c7-a145-8ebf4e150b3a\") " Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.085841 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-utilities" (OuterVolumeSpecName: "utilities") pod "e2033951-d2c0-42c7-a145-8ebf4e150b3a" (UID: "e2033951-d2c0-42c7-a145-8ebf4e150b3a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.090114 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2033951-d2c0-42c7-a145-8ebf4e150b3a-kube-api-access-2n5v6" (OuterVolumeSpecName: "kube-api-access-2n5v6") pod "e2033951-d2c0-42c7-a145-8ebf4e150b3a" (UID: "e2033951-d2c0-42c7-a145-8ebf4e150b3a"). InnerVolumeSpecName "kube-api-access-2n5v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.137610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2033951-d2c0-42c7-a145-8ebf4e150b3a" (UID: "e2033951-d2c0-42c7-a145-8ebf4e150b3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.187529 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n5v6\" (UniqueName: \"kubernetes.io/projected/e2033951-d2c0-42c7-a145-8ebf4e150b3a-kube-api-access-2n5v6\") on node \"crc\" DevicePath \"\"" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.187589 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.187599 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2033951-d2c0-42c7-a145-8ebf4e150b3a-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 04:02:42.293702 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gvjq7"] Jan 29 04:02:42 crc kubenswrapper[4707]: I0129 
04:02:42.301266 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gvjq7"] Jan 29 04:02:43 crc kubenswrapper[4707]: I0129 04:02:43.259206 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" path="/var/lib/kubelet/pods/e2033951-d2c0-42c7-a145-8ebf4e150b3a/volumes" Jan 29 04:03:33 crc kubenswrapper[4707]: I0129 04:03:33.463317 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:03:33 crc kubenswrapper[4707]: I0129 04:03:33.464003 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:04:03 crc kubenswrapper[4707]: I0129 04:04:03.463684 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:04:03 crc kubenswrapper[4707]: I0129 04:04:03.464685 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:04:21 crc kubenswrapper[4707]: I0129 04:04:21.899802 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bbgf9"] Jan 29 04:04:21 crc kubenswrapper[4707]: E0129 04:04:21.901132 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerName="registry-server" Jan 29 04:04:21 crc kubenswrapper[4707]: I0129 04:04:21.901150 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerName="registry-server" Jan 29 04:04:21 crc kubenswrapper[4707]: E0129 04:04:21.901168 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerName="extract-utilities" Jan 29 04:04:21 crc kubenswrapper[4707]: I0129 04:04:21.901175 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerName="extract-utilities" Jan 29 04:04:21 crc kubenswrapper[4707]: E0129 04:04:21.901200 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerName="extract-content" Jan 29 04:04:21 crc kubenswrapper[4707]: I0129 04:04:21.901206 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerName="extract-content" Jan 29 04:04:21 crc kubenswrapper[4707]: I0129 04:04:21.901404 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2033951-d2c0-42c7-a145-8ebf4e150b3a" containerName="registry-server" Jan 29 04:04:21 crc kubenswrapper[4707]: I0129 04:04:21.902982 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:21 crc kubenswrapper[4707]: I0129 04:04:21.912528 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbgf9"] Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.054915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-catalog-content\") pod \"redhat-marketplace-bbgf9\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.055008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-utilities\") pod \"redhat-marketplace-bbgf9\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.055429 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnhdt\" (UniqueName: \"kubernetes.io/projected/db8cffde-d810-465c-aaa7-81ea93d6dbdc-kube-api-access-jnhdt\") pod \"redhat-marketplace-bbgf9\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.158609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-catalog-content\") pod \"redhat-marketplace-bbgf9\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.158710 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-utilities\") pod \"redhat-marketplace-bbgf9\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.158780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnhdt\" (UniqueName: \"kubernetes.io/projected/db8cffde-d810-465c-aaa7-81ea93d6dbdc-kube-api-access-jnhdt\") pod \"redhat-marketplace-bbgf9\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.159291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-catalog-content\") pod \"redhat-marketplace-bbgf9\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.159470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-utilities\") pod \"redhat-marketplace-bbgf9\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.181498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnhdt\" (UniqueName: \"kubernetes.io/projected/db8cffde-d810-465c-aaa7-81ea93d6dbdc-kube-api-access-jnhdt\") pod \"redhat-marketplace-bbgf9\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.232852 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.814757 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbgf9"] Jan 29 04:04:22 crc kubenswrapper[4707]: I0129 04:04:22.884154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbgf9" event={"ID":"db8cffde-d810-465c-aaa7-81ea93d6dbdc","Type":"ContainerStarted","Data":"d616eadb6af7ea4b762f8bd40e78fa7aa256e5ed7853e9442fc556d01d4c5b66"} Jan 29 04:04:23 crc kubenswrapper[4707]: I0129 04:04:23.899798 4707 generic.go:334] "Generic (PLEG): container finished" podID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerID="0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084" exitCode=0 Jan 29 04:04:23 crc kubenswrapper[4707]: I0129 04:04:23.900013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbgf9" event={"ID":"db8cffde-d810-465c-aaa7-81ea93d6dbdc","Type":"ContainerDied","Data":"0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084"} Jan 29 04:04:24 crc kubenswrapper[4707]: I0129 04:04:24.913862 4707 generic.go:334] "Generic (PLEG): container finished" podID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerID="07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe" exitCode=0 Jan 29 04:04:24 crc kubenswrapper[4707]: I0129 04:04:24.913930 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbgf9" event={"ID":"db8cffde-d810-465c-aaa7-81ea93d6dbdc","Type":"ContainerDied","Data":"07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe"} Jan 29 04:04:25 crc kubenswrapper[4707]: I0129 04:04:25.927070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbgf9" 
event={"ID":"db8cffde-d810-465c-aaa7-81ea93d6dbdc","Type":"ContainerStarted","Data":"cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e"} Jan 29 04:04:25 crc kubenswrapper[4707]: I0129 04:04:25.960419 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bbgf9" podStartSLOduration=3.424375838 podStartE2EDuration="4.96039491s" podCreationTimestamp="2026-01-29 04:04:21 +0000 UTC" firstStartedPulling="2026-01-29 04:04:23.903626394 +0000 UTC m=+2217.387855309" lastFinishedPulling="2026-01-29 04:04:25.439645476 +0000 UTC m=+2218.923874381" observedRunningTime="2026-01-29 04:04:25.944651464 +0000 UTC m=+2219.428880389" watchObservedRunningTime="2026-01-29 04:04:25.96039491 +0000 UTC m=+2219.444623825" Jan 29 04:04:32 crc kubenswrapper[4707]: I0129 04:04:32.233423 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:32 crc kubenswrapper[4707]: I0129 04:04:32.234045 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:32 crc kubenswrapper[4707]: I0129 04:04:32.282674 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:33 crc kubenswrapper[4707]: I0129 04:04:33.037163 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:33 crc kubenswrapper[4707]: I0129 04:04:33.080991 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbgf9"] Jan 29 04:04:33 crc kubenswrapper[4707]: I0129 04:04:33.463994 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:04:33 crc kubenswrapper[4707]: I0129 04:04:33.464079 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:04:33 crc kubenswrapper[4707]: I0129 04:04:33.464128 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 04:04:33 crc kubenswrapper[4707]: I0129 04:04:33.464989 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 04:04:33 crc kubenswrapper[4707]: I0129 04:04:33.465051 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" gracePeriod=600 Jan 29 04:04:33 crc kubenswrapper[4707]: E0129 04:04:33.596634 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 
04:04:34 crc kubenswrapper[4707]: I0129 04:04:34.006325 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" exitCode=0 Jan 29 04:04:34 crc kubenswrapper[4707]: I0129 04:04:34.006434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024"} Jan 29 04:04:34 crc kubenswrapper[4707]: I0129 04:04:34.006575 4707 scope.go:117] "RemoveContainer" containerID="9075a066370335e7cc1a1ac646b318a6399393468866cff65b48a27e1e3b8714" Jan 29 04:04:34 crc kubenswrapper[4707]: I0129 04:04:34.007716 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:04:34 crc kubenswrapper[4707]: E0129 04:04:34.008223 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.018832 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bbgf9" podUID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerName="registry-server" containerID="cri-o://cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e" gracePeriod=2 Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.472930 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.521807 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-catalog-content\") pod \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.521954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-utilities\") pod \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.522054 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnhdt\" (UniqueName: \"kubernetes.io/projected/db8cffde-d810-465c-aaa7-81ea93d6dbdc-kube-api-access-jnhdt\") pod \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\" (UID: \"db8cffde-d810-465c-aaa7-81ea93d6dbdc\") " Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.523714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-utilities" (OuterVolumeSpecName: "utilities") pod "db8cffde-d810-465c-aaa7-81ea93d6dbdc" (UID: "db8cffde-d810-465c-aaa7-81ea93d6dbdc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.529848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8cffde-d810-465c-aaa7-81ea93d6dbdc-kube-api-access-jnhdt" (OuterVolumeSpecName: "kube-api-access-jnhdt") pod "db8cffde-d810-465c-aaa7-81ea93d6dbdc" (UID: "db8cffde-d810-465c-aaa7-81ea93d6dbdc"). InnerVolumeSpecName "kube-api-access-jnhdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.545828 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db8cffde-d810-465c-aaa7-81ea93d6dbdc" (UID: "db8cffde-d810-465c-aaa7-81ea93d6dbdc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.624746 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.624777 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnhdt\" (UniqueName: \"kubernetes.io/projected/db8cffde-d810-465c-aaa7-81ea93d6dbdc-kube-api-access-jnhdt\") on node \"crc\" DevicePath \"\"" Jan 29 04:04:35 crc kubenswrapper[4707]: I0129 04:04:35.624788 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cffde-d810-465c-aaa7-81ea93d6dbdc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.030236 4707 generic.go:334] "Generic (PLEG): container finished" podID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerID="cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e" exitCode=0 Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.030287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbgf9" event={"ID":"db8cffde-d810-465c-aaa7-81ea93d6dbdc","Type":"ContainerDied","Data":"cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e"} Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.030318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bbgf9" event={"ID":"db8cffde-d810-465c-aaa7-81ea93d6dbdc","Type":"ContainerDied","Data":"d616eadb6af7ea4b762f8bd40e78fa7aa256e5ed7853e9442fc556d01d4c5b66"} Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.030337 4707 scope.go:117] "RemoveContainer" containerID="cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.030520 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbgf9" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.059889 4707 scope.go:117] "RemoveContainer" containerID="07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.067663 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbgf9"] Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.075233 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbgf9"] Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.101418 4707 scope.go:117] "RemoveContainer" containerID="0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.150906 4707 scope.go:117] "RemoveContainer" containerID="cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e" Jan 29 04:04:36 crc kubenswrapper[4707]: E0129 04:04:36.151844 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e\": container with ID starting with cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e not found: ID does not exist" containerID="cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.151885 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e"} err="failed to get container status \"cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e\": rpc error: code = NotFound desc = could not find container \"cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e\": container with ID starting with cd08db41fc14f46c8e015ab558e3ec56d3891d9080b3923487a6d9ac86e24e1e not found: ID does not exist" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.151913 4707 scope.go:117] "RemoveContainer" containerID="07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe" Jan 29 04:04:36 crc kubenswrapper[4707]: E0129 04:04:36.152382 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe\": container with ID starting with 07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe not found: ID does not exist" containerID="07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.152462 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe"} err="failed to get container status \"07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe\": rpc error: code = NotFound desc = could not find container \"07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe\": container with ID starting with 07c84d9b0d11ff685139f5842c1c2ac92d7168cd75bc1eeea9f75dc0085699fe not found: ID does not exist" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.152514 4707 scope.go:117] "RemoveContainer" containerID="0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084" Jan 29 04:04:36 crc kubenswrapper[4707]: E0129 
04:04:36.154018 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084\": container with ID starting with 0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084 not found: ID does not exist" containerID="0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084" Jan 29 04:04:36 crc kubenswrapper[4707]: I0129 04:04:36.154051 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084"} err="failed to get container status \"0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084\": rpc error: code = NotFound desc = could not find container \"0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084\": container with ID starting with 0e47b101d84bef446dde18891aecf55b813814ce15b87845d614ded2503ea084 not found: ID does not exist" Jan 29 04:04:37 crc kubenswrapper[4707]: I0129 04:04:37.256286 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" path="/var/lib/kubelet/pods/db8cffde-d810-465c-aaa7-81ea93d6dbdc/volumes" Jan 29 04:04:48 crc kubenswrapper[4707]: I0129 04:04:48.243858 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:04:48 crc kubenswrapper[4707]: E0129 04:04:48.244824 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:05:00 crc kubenswrapper[4707]: I0129 04:05:00.244509 
4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:05:00 crc kubenswrapper[4707]: E0129 04:05:00.246657 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:05:15 crc kubenswrapper[4707]: I0129 04:05:15.244388 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:05:15 crc kubenswrapper[4707]: E0129 04:05:15.245313 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:05:27 crc kubenswrapper[4707]: I0129 04:05:27.250345 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:05:27 crc kubenswrapper[4707]: E0129 04:05:27.251706 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:05:42 crc kubenswrapper[4707]: I0129 
04:05:42.244228 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:05:42 crc kubenswrapper[4707]: E0129 04:05:42.246041 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:05:53 crc kubenswrapper[4707]: I0129 04:05:53.130918 4707 generic.go:334] "Generic (PLEG): container finished" podID="a019e4eb-4ee9-4426-bde0-9c6b0319283f" containerID="70c019a3a00458f002451371ecd9887bb1fb0a7b9f791a4ccea1b02e7dff3290" exitCode=0 Jan 29 04:05:53 crc kubenswrapper[4707]: I0129 04:05:53.130993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" event={"ID":"a019e4eb-4ee9-4426-bde0-9c6b0319283f","Type":"ContainerDied","Data":"70c019a3a00458f002451371ecd9887bb1fb0a7b9f791a4ccea1b02e7dff3290"} Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.572755 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.671761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-secret-0\") pod \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.671850 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x88m\" (UniqueName: \"kubernetes.io/projected/a019e4eb-4ee9-4426-bde0-9c6b0319283f-kube-api-access-6x88m\") pod \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.671921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-combined-ca-bundle\") pod \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.672012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-ssh-key-openstack-edpm-ipam\") pod \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.672042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-inventory\") pod \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\" (UID: \"a019e4eb-4ee9-4426-bde0-9c6b0319283f\") " Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.679469 4707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a019e4eb-4ee9-4426-bde0-9c6b0319283f-kube-api-access-6x88m" (OuterVolumeSpecName: "kube-api-access-6x88m") pod "a019e4eb-4ee9-4426-bde0-9c6b0319283f" (UID: "a019e4eb-4ee9-4426-bde0-9c6b0319283f"). InnerVolumeSpecName "kube-api-access-6x88m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.680618 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a019e4eb-4ee9-4426-bde0-9c6b0319283f" (UID: "a019e4eb-4ee9-4426-bde0-9c6b0319283f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.701324 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a019e4eb-4ee9-4426-bde0-9c6b0319283f" (UID: "a019e4eb-4ee9-4426-bde0-9c6b0319283f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.702715 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a019e4eb-4ee9-4426-bde0-9c6b0319283f" (UID: "a019e4eb-4ee9-4426-bde0-9c6b0319283f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.707783 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-inventory" (OuterVolumeSpecName: "inventory") pod "a019e4eb-4ee9-4426-bde0-9c6b0319283f" (UID: "a019e4eb-4ee9-4426-bde0-9c6b0319283f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.774513 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.774571 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x88m\" (UniqueName: \"kubernetes.io/projected/a019e4eb-4ee9-4426-bde0-9c6b0319283f-kube-api-access-6x88m\") on node \"crc\" DevicePath \"\"" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.774585 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.774593 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 04:05:54 crc kubenswrapper[4707]: I0129 04:05:54.774603 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a019e4eb-4ee9-4426-bde0-9c6b0319283f-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.169272 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" event={"ID":"a019e4eb-4ee9-4426-bde0-9c6b0319283f","Type":"ContainerDied","Data":"85bf56ed43799bde0762a30a31ec20b656e099cbdb8644e7b4512e0d846cff66"} Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.169400 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.170707 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85bf56ed43799bde0762a30a31ec20b656e099cbdb8644e7b4512e0d846cff66" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.244036 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:05:55 crc kubenswrapper[4707]: E0129 04:05:55.244400 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.260781 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk"] Jan 29 04:05:55 crc kubenswrapper[4707]: E0129 04:05:55.261298 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a019e4eb-4ee9-4426-bde0-9c6b0319283f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.261327 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a019e4eb-4ee9-4426-bde0-9c6b0319283f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 29 04:05:55 crc kubenswrapper[4707]: E0129 
04:05:55.261343 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerName="registry-server" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.261350 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerName="registry-server" Jan 29 04:05:55 crc kubenswrapper[4707]: E0129 04:05:55.261383 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerName="extract-content" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.261389 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerName="extract-content" Jan 29 04:05:55 crc kubenswrapper[4707]: E0129 04:05:55.261409 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerName="extract-utilities" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.261416 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerName="extract-utilities" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.261819 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a019e4eb-4ee9-4426-bde0-9c6b0319283f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.261840 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8cffde-d810-465c-aaa7-81ea93d6dbdc" containerName="registry-server" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.262623 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.265510 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.265847 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.266950 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.267683 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.268077 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.268216 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.270134 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.272269 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk"] Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.388278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 
04:05:55.388957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.389056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.389222 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.389248 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.389278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.389338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.389381 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lftn6\" (UniqueName: \"kubernetes.io/projected/018b06ef-5822-4b5e-ae32-43bf56e40f19-kube-api-access-lftn6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.389627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.491896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: 
\"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.492250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.492387 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.492466 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.492555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.492643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.492728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lftn6\" (UniqueName: \"kubernetes.io/projected/018b06ef-5822-4b5e-ae32-43bf56e40f19-kube-api-access-lftn6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.492832 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.492935 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.493359 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.503240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.503393 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.503716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.506229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.507094 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.507236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.507479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.527328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lftn6\" (UniqueName: \"kubernetes.io/projected/018b06ef-5822-4b5e-ae32-43bf56e40f19-kube-api-access-lftn6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2b5nk\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:55 crc kubenswrapper[4707]: I0129 04:05:55.599618 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:05:56 crc kubenswrapper[4707]: I0129 04:05:56.167462 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk"] Jan 29 04:05:56 crc kubenswrapper[4707]: W0129 04:05:56.168179 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod018b06ef_5822_4b5e_ae32_43bf56e40f19.slice/crio-df27ee1864fdb06aa7796d62ac7f5d4e34e2c1ec79ed4685f14dd142ea9edeee WatchSource:0}: Error finding container df27ee1864fdb06aa7796d62ac7f5d4e34e2c1ec79ed4685f14dd142ea9edeee: Status 404 returned error can't find the container with id df27ee1864fdb06aa7796d62ac7f5d4e34e2c1ec79ed4685f14dd142ea9edeee Jan 29 04:05:56 crc kubenswrapper[4707]: I0129 04:05:56.172461 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 04:05:56 crc kubenswrapper[4707]: I0129 04:05:56.177843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" event={"ID":"018b06ef-5822-4b5e-ae32-43bf56e40f19","Type":"ContainerStarted","Data":"df27ee1864fdb06aa7796d62ac7f5d4e34e2c1ec79ed4685f14dd142ea9edeee"} Jan 29 04:05:57 crc kubenswrapper[4707]: I0129 04:05:57.186775 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" event={"ID":"018b06ef-5822-4b5e-ae32-43bf56e40f19","Type":"ContainerStarted","Data":"0a585251c0c687c345bda78991893b23bba10d38a39629db44dd196a27a27d85"} Jan 29 04:05:57 crc kubenswrapper[4707]: I0129 04:05:57.209871 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" podStartSLOduration=1.726886556 podStartE2EDuration="2.209849089s" podCreationTimestamp="2026-01-29 04:05:55 +0000 UTC" firstStartedPulling="2026-01-29 
04:05:56.172079039 +0000 UTC m=+2309.656307944" lastFinishedPulling="2026-01-29 04:05:56.655041572 +0000 UTC m=+2310.139270477" observedRunningTime="2026-01-29 04:05:57.204620552 +0000 UTC m=+2310.688849467" watchObservedRunningTime="2026-01-29 04:05:57.209849089 +0000 UTC m=+2310.694077994" Jan 29 04:06:06 crc kubenswrapper[4707]: I0129 04:06:06.243845 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:06:06 crc kubenswrapper[4707]: E0129 04:06:06.244485 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.610297 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zd5hn"] Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.613294 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.627103 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zd5hn"] Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.790108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bgpm\" (UniqueName: \"kubernetes.io/projected/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-kube-api-access-7bgpm\") pod \"certified-operators-zd5hn\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.790411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-utilities\") pod \"certified-operators-zd5hn\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.790506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-catalog-content\") pod \"certified-operators-zd5hn\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.893069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-catalog-content\") pod \"certified-operators-zd5hn\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.893297 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7bgpm\" (UniqueName: \"kubernetes.io/projected/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-kube-api-access-7bgpm\") pod \"certified-operators-zd5hn\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.893457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-utilities\") pod \"certified-operators-zd5hn\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.893761 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-catalog-content\") pod \"certified-operators-zd5hn\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.894160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-utilities\") pod \"certified-operators-zd5hn\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.916931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bgpm\" (UniqueName: \"kubernetes.io/projected/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-kube-api-access-7bgpm\") pod \"certified-operators-zd5hn\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:14 crc kubenswrapper[4707]: I0129 04:06:14.971822 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:15 crc kubenswrapper[4707]: I0129 04:06:15.481774 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zd5hn"] Jan 29 04:06:16 crc kubenswrapper[4707]: I0129 04:06:16.397954 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerID="341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32" exitCode=0 Jan 29 04:06:16 crc kubenswrapper[4707]: I0129 04:06:16.398056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd5hn" event={"ID":"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd","Type":"ContainerDied","Data":"341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32"} Jan 29 04:06:16 crc kubenswrapper[4707]: I0129 04:06:16.398418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd5hn" event={"ID":"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd","Type":"ContainerStarted","Data":"e0563b511ec86f224feb6d1ad1e06b854c86826c891872324db18ff32f9401b8"} Jan 29 04:06:17 crc kubenswrapper[4707]: I0129 04:06:17.414627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd5hn" event={"ID":"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd","Type":"ContainerStarted","Data":"c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887"} Jan 29 04:06:18 crc kubenswrapper[4707]: I0129 04:06:18.424952 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerID="c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887" exitCode=0 Jan 29 04:06:18 crc kubenswrapper[4707]: I0129 04:06:18.425002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd5hn" 
event={"ID":"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd","Type":"ContainerDied","Data":"c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887"} Jan 29 04:06:19 crc kubenswrapper[4707]: I0129 04:06:19.435749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd5hn" event={"ID":"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd","Type":"ContainerStarted","Data":"99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35"} Jan 29 04:06:19 crc kubenswrapper[4707]: I0129 04:06:19.465103 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zd5hn" podStartSLOduration=3.030133631 podStartE2EDuration="5.465085192s" podCreationTimestamp="2026-01-29 04:06:14 +0000 UTC" firstStartedPulling="2026-01-29 04:06:16.399784304 +0000 UTC m=+2329.884013229" lastFinishedPulling="2026-01-29 04:06:18.834735885 +0000 UTC m=+2332.318964790" observedRunningTime="2026-01-29 04:06:19.453913787 +0000 UTC m=+2332.938142692" watchObservedRunningTime="2026-01-29 04:06:19.465085192 +0000 UTC m=+2332.949314097" Jan 29 04:06:21 crc kubenswrapper[4707]: I0129 04:06:21.244596 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:06:21 crc kubenswrapper[4707]: E0129 04:06:21.245032 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:06:24 crc kubenswrapper[4707]: I0129 04:06:24.972441 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:24 crc 
kubenswrapper[4707]: I0129 04:06:24.973118 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:25 crc kubenswrapper[4707]: I0129 04:06:25.024181 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:25 crc kubenswrapper[4707]: I0129 04:06:25.547697 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:26 crc kubenswrapper[4707]: I0129 04:06:26.670631 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zd5hn"] Jan 29 04:06:27 crc kubenswrapper[4707]: I0129 04:06:27.522946 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zd5hn" podUID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerName="registry-server" containerID="cri-o://99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35" gracePeriod=2 Jan 29 04:06:27 crc kubenswrapper[4707]: I0129 04:06:27.965448 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.076010 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-utilities\") pod \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.076352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bgpm\" (UniqueName: \"kubernetes.io/projected/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-kube-api-access-7bgpm\") pod \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.076641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-catalog-content\") pod \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\" (UID: \"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd\") " Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.077730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-utilities" (OuterVolumeSpecName: "utilities") pod "9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" (UID: "9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.083704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-kube-api-access-7bgpm" (OuterVolumeSpecName: "kube-api-access-7bgpm") pod "9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" (UID: "9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd"). InnerVolumeSpecName "kube-api-access-7bgpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.123771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" (UID: "9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.179237 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.179706 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bgpm\" (UniqueName: \"kubernetes.io/projected/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-kube-api-access-7bgpm\") on node \"crc\" DevicePath \"\"" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.179717 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.556761 4707 generic.go:334] "Generic (PLEG): container finished" podID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerID="99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35" exitCode=0 Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.556836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd5hn" event={"ID":"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd","Type":"ContainerDied","Data":"99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35"} Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.556886 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zd5hn" event={"ID":"9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd","Type":"ContainerDied","Data":"e0563b511ec86f224feb6d1ad1e06b854c86826c891872324db18ff32f9401b8"} Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.556916 4707 scope.go:117] "RemoveContainer" containerID="99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.557128 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zd5hn" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.609758 4707 scope.go:117] "RemoveContainer" containerID="c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.610787 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zd5hn"] Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.641979 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zd5hn"] Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.647427 4707 scope.go:117] "RemoveContainer" containerID="341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.701303 4707 scope.go:117] "RemoveContainer" containerID="99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35" Jan 29 04:06:28 crc kubenswrapper[4707]: E0129 04:06:28.703040 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35\": container with ID starting with 99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35 not found: ID does not exist" containerID="99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 
04:06:28.703103 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35"} err="failed to get container status \"99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35\": rpc error: code = NotFound desc = could not find container \"99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35\": container with ID starting with 99c1d1c7e703c038a9934e0ab99aa230e9a7d0ce3d21294a1336a99d51031e35 not found: ID does not exist" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.703142 4707 scope.go:117] "RemoveContainer" containerID="c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887" Jan 29 04:06:28 crc kubenswrapper[4707]: E0129 04:06:28.703639 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887\": container with ID starting with c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887 not found: ID does not exist" containerID="c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.703676 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887"} err="failed to get container status \"c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887\": rpc error: code = NotFound desc = could not find container \"c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887\": container with ID starting with c2fd7f554f4342f7c6cda8ee56a6c4ae17e9b2a32988999920c95e184c714887 not found: ID does not exist" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.703707 4707 scope.go:117] "RemoveContainer" containerID="341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32" Jan 29 04:06:28 crc 
kubenswrapper[4707]: E0129 04:06:28.704317 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32\": container with ID starting with 341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32 not found: ID does not exist" containerID="341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32" Jan 29 04:06:28 crc kubenswrapper[4707]: I0129 04:06:28.704375 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32"} err="failed to get container status \"341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32\": rpc error: code = NotFound desc = could not find container \"341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32\": container with ID starting with 341ff8434695ec7fe167a36a45cdbf2a59637762e1e711b2c507498ddca7da32 not found: ID does not exist" Jan 29 04:06:29 crc kubenswrapper[4707]: I0129 04:06:29.257675 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" path="/var/lib/kubelet/pods/9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd/volumes" Jan 29 04:06:35 crc kubenswrapper[4707]: I0129 04:06:35.244362 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:06:35 crc kubenswrapper[4707]: E0129 04:06:35.245217 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:06:46 crc 
kubenswrapper[4707]: I0129 04:06:46.243234 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:06:46 crc kubenswrapper[4707]: E0129 04:06:46.244166 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:06:59 crc kubenswrapper[4707]: I0129 04:06:59.244459 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:06:59 crc kubenswrapper[4707]: E0129 04:06:59.245289 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:07:11 crc kubenswrapper[4707]: I0129 04:07:11.244043 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:07:11 crc kubenswrapper[4707]: E0129 04:07:11.244677 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 
29 04:07:26 crc kubenswrapper[4707]: I0129 04:07:26.243674 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:07:26 crc kubenswrapper[4707]: E0129 04:07:26.244602 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:07:37 crc kubenswrapper[4707]: I0129 04:07:37.251657 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:07:37 crc kubenswrapper[4707]: E0129 04:07:37.252655 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:07:49 crc kubenswrapper[4707]: I0129 04:07:49.244078 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:07:49 crc kubenswrapper[4707]: E0129 04:07:49.246153 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" 
podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:08:02 crc kubenswrapper[4707]: I0129 04:08:02.246595 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:08:02 crc kubenswrapper[4707]: E0129 04:08:02.249999 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:08:12 crc kubenswrapper[4707]: I0129 04:08:12.526416 4707 generic.go:334] "Generic (PLEG): container finished" podID="018b06ef-5822-4b5e-ae32-43bf56e40f19" containerID="0a585251c0c687c345bda78991893b23bba10d38a39629db44dd196a27a27d85" exitCode=0 Jan 29 04:08:12 crc kubenswrapper[4707]: I0129 04:08:12.526521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" event={"ID":"018b06ef-5822-4b5e-ae32-43bf56e40f19","Type":"ContainerDied","Data":"0a585251c0c687c345bda78991893b23bba10d38a39629db44dd196a27a27d85"} Jan 29 04:08:13 crc kubenswrapper[4707]: I0129 04:08:13.974719 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.021759 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-0\") pod \"018b06ef-5822-4b5e-ae32-43bf56e40f19\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.022482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-1\") pod \"018b06ef-5822-4b5e-ae32-43bf56e40f19\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.022534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-0\") pod \"018b06ef-5822-4b5e-ae32-43bf56e40f19\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.022614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-combined-ca-bundle\") pod \"018b06ef-5822-4b5e-ae32-43bf56e40f19\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.022702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-extra-config-0\") pod \"018b06ef-5822-4b5e-ae32-43bf56e40f19\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 
04:08:14.023490 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-ssh-key-openstack-edpm-ipam\") pod \"018b06ef-5822-4b5e-ae32-43bf56e40f19\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.023604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-1\") pod \"018b06ef-5822-4b5e-ae32-43bf56e40f19\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.023697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-inventory\") pod \"018b06ef-5822-4b5e-ae32-43bf56e40f19\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.023763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lftn6\" (UniqueName: \"kubernetes.io/projected/018b06ef-5822-4b5e-ae32-43bf56e40f19-kube-api-access-lftn6\") pod \"018b06ef-5822-4b5e-ae32-43bf56e40f19\" (UID: \"018b06ef-5822-4b5e-ae32-43bf56e40f19\") " Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.029726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018b06ef-5822-4b5e-ae32-43bf56e40f19-kube-api-access-lftn6" (OuterVolumeSpecName: "kube-api-access-lftn6") pod "018b06ef-5822-4b5e-ae32-43bf56e40f19" (UID: "018b06ef-5822-4b5e-ae32-43bf56e40f19"). InnerVolumeSpecName "kube-api-access-lftn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.030890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "018b06ef-5822-4b5e-ae32-43bf56e40f19" (UID: "018b06ef-5822-4b5e-ae32-43bf56e40f19"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.055086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "018b06ef-5822-4b5e-ae32-43bf56e40f19" (UID: "018b06ef-5822-4b5e-ae32-43bf56e40f19"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.057420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "018b06ef-5822-4b5e-ae32-43bf56e40f19" (UID: "018b06ef-5822-4b5e-ae32-43bf56e40f19"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.058970 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "018b06ef-5822-4b5e-ae32-43bf56e40f19" (UID: "018b06ef-5822-4b5e-ae32-43bf56e40f19"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.060850 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "018b06ef-5822-4b5e-ae32-43bf56e40f19" (UID: "018b06ef-5822-4b5e-ae32-43bf56e40f19"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.067714 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "018b06ef-5822-4b5e-ae32-43bf56e40f19" (UID: "018b06ef-5822-4b5e-ae32-43bf56e40f19"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.077019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "018b06ef-5822-4b5e-ae32-43bf56e40f19" (UID: "018b06ef-5822-4b5e-ae32-43bf56e40f19"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.081561 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-inventory" (OuterVolumeSpecName: "inventory") pod "018b06ef-5822-4b5e-ae32-43bf56e40f19" (UID: "018b06ef-5822-4b5e-ae32-43bf56e40f19"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.126081 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.126117 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.126129 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.126143 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lftn6\" (UniqueName: \"kubernetes.io/projected/018b06ef-5822-4b5e-ae32-43bf56e40f19-kube-api-access-lftn6\") on node \"crc\" DevicePath \"\"" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.126156 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.126168 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.126180 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-migration-ssh-key-1\") 
on node \"crc\" DevicePath \"\"" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.126190 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.126200 4707 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/018b06ef-5822-4b5e-ae32-43bf56e40f19-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.543881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" event={"ID":"018b06ef-5822-4b5e-ae32-43bf56e40f19","Type":"ContainerDied","Data":"df27ee1864fdb06aa7796d62ac7f5d4e34e2c1ec79ed4685f14dd142ea9edeee"} Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.543925 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df27ee1864fdb06aa7796d62ac7f5d4e34e2c1ec79ed4685f14dd142ea9edeee" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.543942 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2b5nk" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.659569 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r"] Jan 29 04:08:14 crc kubenswrapper[4707]: E0129 04:08:14.659966 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerName="registry-server" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.659984 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerName="registry-server" Jan 29 04:08:14 crc kubenswrapper[4707]: E0129 04:08:14.659996 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018b06ef-5822-4b5e-ae32-43bf56e40f19" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.660003 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="018b06ef-5822-4b5e-ae32-43bf56e40f19" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 29 04:08:14 crc kubenswrapper[4707]: E0129 04:08:14.660011 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerName="extract-utilities" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.660017 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerName="extract-utilities" Jan 29 04:08:14 crc kubenswrapper[4707]: E0129 04:08:14.660032 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerName="extract-content" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.660039 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerName="extract-content" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.660205 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="018b06ef-5822-4b5e-ae32-43bf56e40f19" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.660227 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d167e87-32b4-4ae3-90cf-5b7dcf0d20dd" containerName="registry-server" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.660850 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.663250 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.663934 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tnd6f" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.664154 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.664315 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.664495 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.675391 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r"] Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.737026 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.737102 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.737343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.737493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.737786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.738193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.738368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgp7v\" (UniqueName: \"kubernetes.io/projected/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-kube-api-access-rgp7v\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.840312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.840434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: 
I0129 04:08:14.840453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.840575 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.840645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.840674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgp7v\" (UniqueName: \"kubernetes.io/projected/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-kube-api-access-rgp7v\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.840718 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.845361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.845491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.845545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.845836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: 
\"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.845909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.846507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.858905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgp7v\" (UniqueName: \"kubernetes.io/projected/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-kube-api-access-rgp7v\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:14 crc kubenswrapper[4707]: I0129 04:08:14.977246 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:08:15 crc kubenswrapper[4707]: I0129 04:08:15.635262 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r"] Jan 29 04:08:16 crc kubenswrapper[4707]: I0129 04:08:16.245589 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:08:16 crc kubenswrapper[4707]: E0129 04:08:16.246276 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:08:16 crc kubenswrapper[4707]: I0129 04:08:16.562671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" event={"ID":"b8c300e6-01c5-493d-b263-2b6cdfaba0c9","Type":"ContainerStarted","Data":"f33f5a75eb4ef9972107f7a6b561467d65eeec82092bbb5e9121f5c7cd5f7510"} Jan 29 04:08:16 crc kubenswrapper[4707]: I0129 04:08:16.562713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" event={"ID":"b8c300e6-01c5-493d-b263-2b6cdfaba0c9","Type":"ContainerStarted","Data":"f8c59c0c6c6aaef32e77a31d181bb7048fdcc1e0710e17aa6df708c328fdfcd7"} Jan 29 04:08:16 crc kubenswrapper[4707]: I0129 04:08:16.585577 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" podStartSLOduration=2.083481941 podStartE2EDuration="2.585549012s" podCreationTimestamp="2026-01-29 04:08:14 +0000 UTC" firstStartedPulling="2026-01-29 
04:08:15.649361905 +0000 UTC m=+2449.133590810" lastFinishedPulling="2026-01-29 04:08:16.151428976 +0000 UTC m=+2449.635657881" observedRunningTime="2026-01-29 04:08:16.580792298 +0000 UTC m=+2450.065021243" watchObservedRunningTime="2026-01-29 04:08:16.585549012 +0000 UTC m=+2450.069777927" Jan 29 04:08:28 crc kubenswrapper[4707]: I0129 04:08:28.243661 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:08:28 crc kubenswrapper[4707]: E0129 04:08:28.244378 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:08:42 crc kubenswrapper[4707]: I0129 04:08:42.243990 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:08:42 crc kubenswrapper[4707]: E0129 04:08:42.244781 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:08:55 crc kubenswrapper[4707]: I0129 04:08:55.243330 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:08:55 crc kubenswrapper[4707]: E0129 04:08:55.245215 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:09:08 crc kubenswrapper[4707]: I0129 04:09:08.243609 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:09:08 crc kubenswrapper[4707]: E0129 04:09:08.244399 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:09:23 crc kubenswrapper[4707]: I0129 04:09:23.244330 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:09:23 crc kubenswrapper[4707]: E0129 04:09:23.245082 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:09:37 crc kubenswrapper[4707]: I0129 04:09:37.251134 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:09:38 crc kubenswrapper[4707]: I0129 04:09:38.273261 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"fbe00dcfbe047d7c82d09303891b3a5c6b17dc638addfcf5786780c3073d8600"} Jan 29 04:10:41 crc kubenswrapper[4707]: I0129 04:10:41.941632 4707 generic.go:334] "Generic (PLEG): container finished" podID="b8c300e6-01c5-493d-b263-2b6cdfaba0c9" containerID="f33f5a75eb4ef9972107f7a6b561467d65eeec82092bbb5e9121f5c7cd5f7510" exitCode=0 Jan 29 04:10:41 crc kubenswrapper[4707]: I0129 04:10:41.941715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" event={"ID":"b8c300e6-01c5-493d-b263-2b6cdfaba0c9","Type":"ContainerDied","Data":"f33f5a75eb4ef9972107f7a6b561467d65eeec82092bbb5e9121f5c7cd5f7510"} Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.337787 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.428907 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgp7v\" (UniqueName: \"kubernetes.io/projected/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-kube-api-access-rgp7v\") pod \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.429024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-inventory\") pod \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.429626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-telemetry-combined-ca-bundle\") pod \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.430127 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-0\") pod \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.430557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ssh-key-openstack-edpm-ipam\") pod \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.430669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-1\") pod \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.430710 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-2\") pod \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\" (UID: \"b8c300e6-01c5-493d-b263-2b6cdfaba0c9\") " Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.435053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-kube-api-access-rgp7v" (OuterVolumeSpecName: "kube-api-access-rgp7v") 
pod "b8c300e6-01c5-493d-b263-2b6cdfaba0c9" (UID: "b8c300e6-01c5-493d-b263-2b6cdfaba0c9"). InnerVolumeSpecName "kube-api-access-rgp7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.435754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b8c300e6-01c5-493d-b263-2b6cdfaba0c9" (UID: "b8c300e6-01c5-493d-b263-2b6cdfaba0c9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.457686 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b8c300e6-01c5-493d-b263-2b6cdfaba0c9" (UID: "b8c300e6-01c5-493d-b263-2b6cdfaba0c9"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.458681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-inventory" (OuterVolumeSpecName: "inventory") pod "b8c300e6-01c5-493d-b263-2b6cdfaba0c9" (UID: "b8c300e6-01c5-493d-b263-2b6cdfaba0c9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.460525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b8c300e6-01c5-493d-b263-2b6cdfaba0c9" (UID: "b8c300e6-01c5-493d-b263-2b6cdfaba0c9"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.465722 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b8c300e6-01c5-493d-b263-2b6cdfaba0c9" (UID: "b8c300e6-01c5-493d-b263-2b6cdfaba0c9"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.467859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b8c300e6-01c5-493d-b263-2b6cdfaba0c9" (UID: "b8c300e6-01c5-493d-b263-2b6cdfaba0c9"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.533766 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-inventory\") on node \"crc\" DevicePath \"\"" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.533799 4707 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.533811 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.533821 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.533858 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.533869 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.533878 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgp7v\" (UniqueName: 
\"kubernetes.io/projected/b8c300e6-01c5-493d-b263-2b6cdfaba0c9-kube-api-access-rgp7v\") on node \"crc\" DevicePath \"\"" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.960124 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" event={"ID":"b8c300e6-01c5-493d-b263-2b6cdfaba0c9","Type":"ContainerDied","Data":"f8c59c0c6c6aaef32e77a31d181bb7048fdcc1e0710e17aa6df708c328fdfcd7"} Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.960162 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8c59c0c6c6aaef32e77a31d181bb7048fdcc1e0710e17aa6df708c328fdfcd7" Jan 29 04:10:43 crc kubenswrapper[4707]: I0129 04:10:43.960192 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.171014 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nbjhc"] Jan 29 04:11:56 crc kubenswrapper[4707]: E0129 04:11:56.172116 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c300e6-01c5-493d-b263-2b6cdfaba0c9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.172140 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c300e6-01c5-493d-b263-2b6cdfaba0c9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.172376 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c300e6-01c5-493d-b263-2b6cdfaba0c9" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.174238 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.184712 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nbjhc"] Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.326671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-catalog-content\") pod \"redhat-operators-nbjhc\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.326805 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxxdx\" (UniqueName: \"kubernetes.io/projected/bb18e1a6-42f5-46c8-8029-49f322938bce-kube-api-access-mxxdx\") pod \"redhat-operators-nbjhc\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.326891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-utilities\") pod \"redhat-operators-nbjhc\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.429057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-catalog-content\") pod \"redhat-operators-nbjhc\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.429150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mxxdx\" (UniqueName: \"kubernetes.io/projected/bb18e1a6-42f5-46c8-8029-49f322938bce-kube-api-access-mxxdx\") pod \"redhat-operators-nbjhc\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.429213 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-utilities\") pod \"redhat-operators-nbjhc\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.429725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-utilities\") pod \"redhat-operators-nbjhc\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.429804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-catalog-content\") pod \"redhat-operators-nbjhc\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.453421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxxdx\" (UniqueName: \"kubernetes.io/projected/bb18e1a6-42f5-46c8-8029-49f322938bce-kube-api-access-mxxdx\") pod \"redhat-operators-nbjhc\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.534346 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:11:56 crc kubenswrapper[4707]: I0129 04:11:56.993041 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nbjhc"] Jan 29 04:11:57 crc kubenswrapper[4707]: W0129 04:11:57.004220 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb18e1a6_42f5_46c8_8029_49f322938bce.slice/crio-ebc65752b92f379b4cf36b8cbcc6f41d3ae789d4447deb3045c1145a037967a7 WatchSource:0}: Error finding container ebc65752b92f379b4cf36b8cbcc6f41d3ae789d4447deb3045c1145a037967a7: Status 404 returned error can't find the container with id ebc65752b92f379b4cf36b8cbcc6f41d3ae789d4447deb3045c1145a037967a7 Jan 29 04:11:57 crc kubenswrapper[4707]: I0129 04:11:57.701851 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerID="2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529" exitCode=0 Jan 29 04:11:57 crc kubenswrapper[4707]: I0129 04:11:57.702273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbjhc" event={"ID":"bb18e1a6-42f5-46c8-8029-49f322938bce","Type":"ContainerDied","Data":"2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529"} Jan 29 04:11:57 crc kubenswrapper[4707]: I0129 04:11:57.702303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbjhc" event={"ID":"bb18e1a6-42f5-46c8-8029-49f322938bce","Type":"ContainerStarted","Data":"ebc65752b92f379b4cf36b8cbcc6f41d3ae789d4447deb3045c1145a037967a7"} Jan 29 04:11:57 crc kubenswrapper[4707]: I0129 04:11:57.704481 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 04:11:58 crc kubenswrapper[4707]: I0129 04:11:58.712348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-nbjhc" event={"ID":"bb18e1a6-42f5-46c8-8029-49f322938bce","Type":"ContainerStarted","Data":"0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7"} Jan 29 04:11:59 crc kubenswrapper[4707]: I0129 04:11:59.724605 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerID="0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7" exitCode=0 Jan 29 04:11:59 crc kubenswrapper[4707]: I0129 04:11:59.724805 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbjhc" event={"ID":"bb18e1a6-42f5-46c8-8029-49f322938bce","Type":"ContainerDied","Data":"0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7"} Jan 29 04:12:00 crc kubenswrapper[4707]: I0129 04:12:00.735017 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbjhc" event={"ID":"bb18e1a6-42f5-46c8-8029-49f322938bce","Type":"ContainerStarted","Data":"cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad"} Jan 29 04:12:00 crc kubenswrapper[4707]: I0129 04:12:00.762260 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nbjhc" podStartSLOduration=2.344369067 podStartE2EDuration="4.762239928s" podCreationTimestamp="2026-01-29 04:11:56 +0000 UTC" firstStartedPulling="2026-01-29 04:11:57.704240266 +0000 UTC m=+2671.188469171" lastFinishedPulling="2026-01-29 04:12:00.122111097 +0000 UTC m=+2673.606340032" observedRunningTime="2026-01-29 04:12:00.753375589 +0000 UTC m=+2674.237604494" watchObservedRunningTime="2026-01-29 04:12:00.762239928 +0000 UTC m=+2674.246468833" Jan 29 04:12:03 crc kubenswrapper[4707]: I0129 04:12:03.463211 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:12:03 crc kubenswrapper[4707]: I0129 04:12:03.463576 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:12:06 crc kubenswrapper[4707]: I0129 04:12:06.535024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:12:06 crc kubenswrapper[4707]: I0129 04:12:06.535391 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:12:06 crc kubenswrapper[4707]: I0129 04:12:06.580266 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:12:06 crc kubenswrapper[4707]: I0129 04:12:06.843180 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:12:06 crc kubenswrapper[4707]: I0129 04:12:06.913432 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nbjhc"] Jan 29 04:12:08 crc kubenswrapper[4707]: I0129 04:12:08.811406 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nbjhc" podUID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerName="registry-server" containerID="cri-o://cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad" gracePeriod=2 Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.323569 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.426990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxxdx\" (UniqueName: \"kubernetes.io/projected/bb18e1a6-42f5-46c8-8029-49f322938bce-kube-api-access-mxxdx\") pod \"bb18e1a6-42f5-46c8-8029-49f322938bce\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.427332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-catalog-content\") pod \"bb18e1a6-42f5-46c8-8029-49f322938bce\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.427460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-utilities\") pod \"bb18e1a6-42f5-46c8-8029-49f322938bce\" (UID: \"bb18e1a6-42f5-46c8-8029-49f322938bce\") " Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.429405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-utilities" (OuterVolumeSpecName: "utilities") pod "bb18e1a6-42f5-46c8-8029-49f322938bce" (UID: "bb18e1a6-42f5-46c8-8029-49f322938bce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.434940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb18e1a6-42f5-46c8-8029-49f322938bce-kube-api-access-mxxdx" (OuterVolumeSpecName: "kube-api-access-mxxdx") pod "bb18e1a6-42f5-46c8-8029-49f322938bce" (UID: "bb18e1a6-42f5-46c8-8029-49f322938bce"). InnerVolumeSpecName "kube-api-access-mxxdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.530455 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxxdx\" (UniqueName: \"kubernetes.io/projected/bb18e1a6-42f5-46c8-8029-49f322938bce-kube-api-access-mxxdx\") on node \"crc\" DevicePath \"\"" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.530518 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.559406 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb18e1a6-42f5-46c8-8029-49f322938bce" (UID: "bb18e1a6-42f5-46c8-8029-49f322938bce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.633205 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb18e1a6-42f5-46c8-8029-49f322938bce-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.830239 4707 generic.go:334] "Generic (PLEG): container finished" podID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerID="cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad" exitCode=0 Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.830363 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nbjhc" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.830392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbjhc" event={"ID":"bb18e1a6-42f5-46c8-8029-49f322938bce","Type":"ContainerDied","Data":"cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad"} Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.841425 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbjhc" event={"ID":"bb18e1a6-42f5-46c8-8029-49f322938bce","Type":"ContainerDied","Data":"ebc65752b92f379b4cf36b8cbcc6f41d3ae789d4447deb3045c1145a037967a7"} Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.841583 4707 scope.go:117] "RemoveContainer" containerID="cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.873810 4707 scope.go:117] "RemoveContainer" containerID="0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.883609 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nbjhc"] Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.893945 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nbjhc"] Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.902143 4707 scope.go:117] "RemoveContainer" containerID="2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.947092 4707 scope.go:117] "RemoveContainer" containerID="cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad" Jan 29 04:12:09 crc kubenswrapper[4707]: E0129 04:12:09.947700 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad\": container with ID starting with cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad not found: ID does not exist" containerID="cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.947754 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad"} err="failed to get container status \"cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad\": rpc error: code = NotFound desc = could not find container \"cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad\": container with ID starting with cb219e042010bd599f80e3fc570baa90e9f1ac98ed19f8f4483b466aa83d47ad not found: ID does not exist" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.947789 4707 scope.go:117] "RemoveContainer" containerID="0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7" Jan 29 04:12:09 crc kubenswrapper[4707]: E0129 04:12:09.948203 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7\": container with ID starting with 0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7 not found: ID does not exist" containerID="0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.948237 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7"} err="failed to get container status \"0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7\": rpc error: code = NotFound desc = could not find container \"0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7\": container with ID 
starting with 0768ed27c53b0035ff9413ab4464b76c17dbb44146eb2d1850d5b9c8a48cb9e7 not found: ID does not exist" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.948258 4707 scope.go:117] "RemoveContainer" containerID="2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529" Jan 29 04:12:09 crc kubenswrapper[4707]: E0129 04:12:09.948608 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529\": container with ID starting with 2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529 not found: ID does not exist" containerID="2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529" Jan 29 04:12:09 crc kubenswrapper[4707]: I0129 04:12:09.948652 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529"} err="failed to get container status \"2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529\": rpc error: code = NotFound desc = could not find container \"2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529\": container with ID starting with 2f40ee87f51bb94cd6b941568228910a3bf919744a7f4d134fbfc77ec5e4c529 not found: ID does not exist" Jan 29 04:12:11 crc kubenswrapper[4707]: I0129 04:12:11.263466 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb18e1a6-42f5-46c8-8029-49f322938bce" path="/var/lib/kubelet/pods/bb18e1a6-42f5-46c8-8029-49f322938bce/volumes" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.434426 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-clzjf"] Jan 29 04:12:30 crc kubenswrapper[4707]: E0129 04:12:30.437562 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerName="extract-content" Jan 29 04:12:30 crc 
kubenswrapper[4707]: I0129 04:12:30.437678 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerName="extract-content" Jan 29 04:12:30 crc kubenswrapper[4707]: E0129 04:12:30.437767 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerName="registry-server" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.437850 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerName="registry-server" Jan 29 04:12:30 crc kubenswrapper[4707]: E0129 04:12:30.437944 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerName="extract-utilities" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.438011 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerName="extract-utilities" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.438356 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb18e1a6-42f5-46c8-8029-49f322938bce" containerName="registry-server" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.441151 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.473524 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clzjf"] Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.596863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-utilities\") pod \"community-operators-clzjf\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.596927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-kube-api-access-rgmhl\") pod \"community-operators-clzjf\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.597056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-catalog-content\") pod \"community-operators-clzjf\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.698687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-utilities\") pod \"community-operators-clzjf\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.698769 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-kube-api-access-rgmhl\") pod \"community-operators-clzjf\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.698932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-catalog-content\") pod \"community-operators-clzjf\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.699955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-catalog-content\") pod \"community-operators-clzjf\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.700097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-utilities\") pod \"community-operators-clzjf\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.728865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-kube-api-access-rgmhl\") pod \"community-operators-clzjf\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:30 crc kubenswrapper[4707]: I0129 04:12:30.773375 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:31 crc kubenswrapper[4707]: I0129 04:12:31.374947 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clzjf"] Jan 29 04:12:32 crc kubenswrapper[4707]: I0129 04:12:32.074135 4707 generic.go:334] "Generic (PLEG): container finished" podID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerID="2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5" exitCode=0 Jan 29 04:12:32 crc kubenswrapper[4707]: I0129 04:12:32.074302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clzjf" event={"ID":"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4","Type":"ContainerDied","Data":"2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5"} Jan 29 04:12:32 crc kubenswrapper[4707]: I0129 04:12:32.074722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clzjf" event={"ID":"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4","Type":"ContainerStarted","Data":"29beb56ff0db8d65d197070d7ccbb1efd02142f4e46bdcc63a8a9685c704918e"} Jan 29 04:12:33 crc kubenswrapper[4707]: I0129 04:12:33.092024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clzjf" event={"ID":"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4","Type":"ContainerStarted","Data":"fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699"} Jan 29 04:12:33 crc kubenswrapper[4707]: I0129 04:12:33.462909 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:12:33 crc kubenswrapper[4707]: I0129 04:12:33.463001 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:12:34 crc kubenswrapper[4707]: I0129 04:12:34.108516 4707 generic.go:334] "Generic (PLEG): container finished" podID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerID="fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699" exitCode=0 Jan 29 04:12:34 crc kubenswrapper[4707]: I0129 04:12:34.108979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clzjf" event={"ID":"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4","Type":"ContainerDied","Data":"fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699"} Jan 29 04:12:35 crc kubenswrapper[4707]: I0129 04:12:35.126053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clzjf" event={"ID":"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4","Type":"ContainerStarted","Data":"e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348"} Jan 29 04:12:35 crc kubenswrapper[4707]: I0129 04:12:35.152480 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-clzjf" podStartSLOduration=2.681869182 podStartE2EDuration="5.152456507s" podCreationTimestamp="2026-01-29 04:12:30 +0000 UTC" firstStartedPulling="2026-01-29 04:12:32.077005273 +0000 UTC m=+2705.561234168" lastFinishedPulling="2026-01-29 04:12:34.547592588 +0000 UTC m=+2708.031821493" observedRunningTime="2026-01-29 04:12:35.145198913 +0000 UTC m=+2708.629427818" watchObservedRunningTime="2026-01-29 04:12:35.152456507 +0000 UTC m=+2708.636685402" Jan 29 04:12:40 crc kubenswrapper[4707]: I0129 04:12:40.773855 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:40 crc kubenswrapper[4707]: I0129 04:12:40.775152 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:40 crc kubenswrapper[4707]: I0129 04:12:40.857349 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:41 crc kubenswrapper[4707]: I0129 04:12:41.286746 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:41 crc kubenswrapper[4707]: I0129 04:12:41.369302 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clzjf"] Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.231371 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-clzjf" podUID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerName="registry-server" containerID="cri-o://e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348" gracePeriod=2 Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.745908 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.855795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-kube-api-access-rgmhl\") pod \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.856038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-catalog-content\") pod \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.856079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-utilities\") pod \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\" (UID: \"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4\") " Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.858792 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-utilities" (OuterVolumeSpecName: "utilities") pod "28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" (UID: "28ec3dcc-5ca6-433a-ac7d-4b46704eeff4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.876976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-kube-api-access-rgmhl" (OuterVolumeSpecName: "kube-api-access-rgmhl") pod "28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" (UID: "28ec3dcc-5ca6-433a-ac7d-4b46704eeff4"). InnerVolumeSpecName "kube-api-access-rgmhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.941498 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" (UID: "28ec3dcc-5ca6-433a-ac7d-4b46704eeff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.958686 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.958728 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:12:43 crc kubenswrapper[4707]: I0129 04:12:43.958740 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4-kube-api-access-rgmhl\") on node \"crc\" DevicePath \"\"" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.252734 4707 generic.go:334] "Generic (PLEG): container finished" podID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerID="e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348" exitCode=0 Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.252819 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clzjf" event={"ID":"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4","Type":"ContainerDied","Data":"e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348"} Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.252870 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-clzjf" event={"ID":"28ec3dcc-5ca6-433a-ac7d-4b46704eeff4","Type":"ContainerDied","Data":"29beb56ff0db8d65d197070d7ccbb1efd02142f4e46bdcc63a8a9685c704918e"} Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.252906 4707 scope.go:117] "RemoveContainer" containerID="e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.253137 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clzjf" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.288967 4707 scope.go:117] "RemoveContainer" containerID="fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.318172 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clzjf"] Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.327746 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-clzjf"] Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.336127 4707 scope.go:117] "RemoveContainer" containerID="2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.394407 4707 scope.go:117] "RemoveContainer" containerID="e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348" Jan 29 04:12:44 crc kubenswrapper[4707]: E0129 04:12:44.395781 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348\": container with ID starting with e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348 not found: ID does not exist" containerID="e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 
04:12:44.395836 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348"} err="failed to get container status \"e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348\": rpc error: code = NotFound desc = could not find container \"e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348\": container with ID starting with e85d12e49c85d22e9c32a7da844c03afc7f90c6902f3e1c41189fc5a8ce43348 not found: ID does not exist" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.395881 4707 scope.go:117] "RemoveContainer" containerID="fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699" Jan 29 04:12:44 crc kubenswrapper[4707]: E0129 04:12:44.396393 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699\": container with ID starting with fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699 not found: ID does not exist" containerID="fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.396436 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699"} err="failed to get container status \"fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699\": rpc error: code = NotFound desc = could not find container \"fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699\": container with ID starting with fcb31f1e218fd5c616816155a7b1b6b3cfa338bbe4e2221681df78e2d500b699 not found: ID does not exist" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.396462 4707 scope.go:117] "RemoveContainer" containerID="2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5" Jan 29 04:12:44 crc 
kubenswrapper[4707]: E0129 04:12:44.397612 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5\": container with ID starting with 2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5 not found: ID does not exist" containerID="2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5" Jan 29 04:12:44 crc kubenswrapper[4707]: I0129 04:12:44.397746 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5"} err="failed to get container status \"2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5\": rpc error: code = NotFound desc = could not find container \"2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5\": container with ID starting with 2d331e51e77bb2b2a5808d452c4162e17668a128533b87f43eede1e40e2f2cf5 not found: ID does not exist" Jan 29 04:12:45 crc kubenswrapper[4707]: I0129 04:12:45.265905 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" path="/var/lib/kubelet/pods/28ec3dcc-5ca6-433a-ac7d-4b46704eeff4/volumes" Jan 29 04:13:03 crc kubenswrapper[4707]: I0129 04:13:03.463714 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:13:03 crc kubenswrapper[4707]: I0129 04:13:03.464392 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Jan 29 04:13:03 crc kubenswrapper[4707]: I0129 04:13:03.464456 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 04:13:03 crc kubenswrapper[4707]: I0129 04:13:03.465367 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fbe00dcfbe047d7c82d09303891b3a5c6b17dc638addfcf5786780c3073d8600"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 04:13:03 crc kubenswrapper[4707]: I0129 04:13:03.465442 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://fbe00dcfbe047d7c82d09303891b3a5c6b17dc638addfcf5786780c3073d8600" gracePeriod=600 Jan 29 04:13:04 crc kubenswrapper[4707]: I0129 04:13:04.498934 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="fbe00dcfbe047d7c82d09303891b3a5c6b17dc638addfcf5786780c3073d8600" exitCode=0 Jan 29 04:13:04 crc kubenswrapper[4707]: I0129 04:13:04.499036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"fbe00dcfbe047d7c82d09303891b3a5c6b17dc638addfcf5786780c3073d8600"} Jan 29 04:13:04 crc kubenswrapper[4707]: I0129 04:13:04.500971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" 
event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7"} Jan 29 04:13:04 crc kubenswrapper[4707]: I0129 04:13:04.501067 4707 scope.go:117] "RemoveContainer" containerID="034628fc76db5048735f4a185d370dcb515f7aec75f81dae4530e27f66a44024" Jan 29 04:13:29 crc kubenswrapper[4707]: I0129 04:13:29.465685 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7886d5cc69-w8rzq_0a32b73c-f66f-425f-81a9-ef1cc36041d4/manager/0.log" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.644007 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.644909 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f44781c6-75de-479b-b6bb-33bc27d468fa" containerName="openstackclient" containerID="cri-o://0932b0ac3dc2b05af0e58c966c543784158e63ace8b7014e5b797132c779395b" gracePeriod=2 Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.657770 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.689862 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 29 04:13:31 crc kubenswrapper[4707]: E0129 04:13:31.690502 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerName="registry-server" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.690530 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerName="registry-server" Jan 29 04:13:31 crc kubenswrapper[4707]: E0129 04:13:31.690963 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerName="extract-utilities" Jan 29 
04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.690988 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerName="extract-utilities" Jan 29 04:13:31 crc kubenswrapper[4707]: E0129 04:13:31.691013 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerName="extract-content" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.691023 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerName="extract-content" Jan 29 04:13:31 crc kubenswrapper[4707]: E0129 04:13:31.691044 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44781c6-75de-479b-b6bb-33bc27d468fa" containerName="openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.691052 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44781c6-75de-479b-b6bb-33bc27d468fa" containerName="openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.691317 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ec3dcc-5ca6-433a-ac7d-4b46704eeff4" containerName="registry-server" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.691335 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44781c6-75de-479b-b6bb-33bc27d468fa" containerName="openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.692839 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.704650 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.734515 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f44781c6-75de-479b-b6bb-33bc27d468fa" podUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.824335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.825353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52grj\" (UniqueName: \"kubernetes.io/projected/12d355a2-5cc3-43c5-96b0-b11f83de869d-kube-api-access-52grj\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.826940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.827342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.929113 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.929227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config-secret\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.929290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.929329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52grj\" (UniqueName: \"kubernetes.io/projected/12d355a2-5cc3-43c5-96b0-b11f83de869d-kube-api-access-52grj\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.930635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.937814 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config-secret\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.939185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:31 crc kubenswrapper[4707]: I0129 04:13:31.954594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52grj\" (UniqueName: \"kubernetes.io/projected/12d355a2-5cc3-43c5-96b0-b11f83de869d-kube-api-access-52grj\") pod \"openstackclient\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " pod="openstack/openstackclient" Jan 29 04:13:32 crc kubenswrapper[4707]: I0129 04:13:32.031978 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 04:13:32 crc kubenswrapper[4707]: I0129 04:13:32.620404 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 04:13:32 crc kubenswrapper[4707]: I0129 04:13:32.814645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"12d355a2-5cc3-43c5-96b0-b11f83de869d","Type":"ContainerStarted","Data":"886f804f5c94698f36a55193b79e9948d7246e23d6fac44464ea8f3b4a8ef9f2"} Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.073434 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-zm8bg"] Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.075500 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.086970 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-zm8bg"] Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.185153 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-c6cd-account-create-update-68v6b"] Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.186437 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.188911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.198547 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-c6cd-account-create-update-68v6b"] Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.260412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d200ad2-aa0f-468d-92ac-0563af93b582-operator-scripts\") pod \"aodh-db-create-zm8bg\" (UID: \"7d200ad2-aa0f-468d-92ac-0563af93b582\") " pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.260504 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7sk\" (UniqueName: \"kubernetes.io/projected/7d200ad2-aa0f-468d-92ac-0563af93b582-kube-api-access-nr7sk\") pod \"aodh-db-create-zm8bg\" (UID: \"7d200ad2-aa0f-468d-92ac-0563af93b582\") " pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.363214 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7sk\" (UniqueName: \"kubernetes.io/projected/7d200ad2-aa0f-468d-92ac-0563af93b582-kube-api-access-nr7sk\") pod 
\"aodh-db-create-zm8bg\" (UID: \"7d200ad2-aa0f-468d-92ac-0563af93b582\") " pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.363349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-operator-scripts\") pod \"aodh-c6cd-account-create-update-68v6b\" (UID: \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\") " pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.363410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqvwp\" (UniqueName: \"kubernetes.io/projected/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-kube-api-access-jqvwp\") pod \"aodh-c6cd-account-create-update-68v6b\" (UID: \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\") " pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.363480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d200ad2-aa0f-468d-92ac-0563af93b582-operator-scripts\") pod \"aodh-db-create-zm8bg\" (UID: \"7d200ad2-aa0f-468d-92ac-0563af93b582\") " pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.364555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d200ad2-aa0f-468d-92ac-0563af93b582-operator-scripts\") pod \"aodh-db-create-zm8bg\" (UID: \"7d200ad2-aa0f-468d-92ac-0563af93b582\") " pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.396517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7sk\" (UniqueName: \"kubernetes.io/projected/7d200ad2-aa0f-468d-92ac-0563af93b582-kube-api-access-nr7sk\") pod 
\"aodh-db-create-zm8bg\" (UID: \"7d200ad2-aa0f-468d-92ac-0563af93b582\") " pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.447953 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.466344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-operator-scripts\") pod \"aodh-c6cd-account-create-update-68v6b\" (UID: \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\") " pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.466483 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqvwp\" (UniqueName: \"kubernetes.io/projected/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-kube-api-access-jqvwp\") pod \"aodh-c6cd-account-create-update-68v6b\" (UID: \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\") " pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.467795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-operator-scripts\") pod \"aodh-c6cd-account-create-update-68v6b\" (UID: \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\") " pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.493488 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqvwp\" (UniqueName: \"kubernetes.io/projected/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-kube-api-access-jqvwp\") pod \"aodh-c6cd-account-create-update-68v6b\" (UID: \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\") " pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.506145 
4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.833601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"12d355a2-5cc3-43c5-96b0-b11f83de869d","Type":"ContainerStarted","Data":"cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8"} Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.841339 4707 generic.go:334] "Generic (PLEG): container finished" podID="f44781c6-75de-479b-b6bb-33bc27d468fa" containerID="0932b0ac3dc2b05af0e58c966c543784158e63ace8b7014e5b797132c779395b" exitCode=137 Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.858705 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.858686789 podStartE2EDuration="2.858686789s" podCreationTimestamp="2026-01-29 04:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:13:33.854348416 +0000 UTC m=+2767.338577321" watchObservedRunningTime="2026-01-29 04:13:33.858686789 +0000 UTC m=+2767.342915694" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.944052 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 04:13:33 crc kubenswrapper[4707]: I0129 04:13:33.982500 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-zm8bg"] Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.084263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config\") pod \"f44781c6-75de-479b-b6bb-33bc27d468fa\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.084326 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config-secret\") pod \"f44781c6-75de-479b-b6bb-33bc27d468fa\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.084388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbc25\" (UniqueName: \"kubernetes.io/projected/f44781c6-75de-479b-b6bb-33bc27d468fa-kube-api-access-zbc25\") pod \"f44781c6-75de-479b-b6bb-33bc27d468fa\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.084402 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-combined-ca-bundle\") pod \"f44781c6-75de-479b-b6bb-33bc27d468fa\" (UID: \"f44781c6-75de-479b-b6bb-33bc27d468fa\") " Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.094119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44781c6-75de-479b-b6bb-33bc27d468fa-kube-api-access-zbc25" (OuterVolumeSpecName: "kube-api-access-zbc25") pod "f44781c6-75de-479b-b6bb-33bc27d468fa" (UID: 
"f44781c6-75de-479b-b6bb-33bc27d468fa"). InnerVolumeSpecName "kube-api-access-zbc25". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.111031 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-c6cd-account-create-update-68v6b"] Jan 29 04:13:34 crc kubenswrapper[4707]: W0129 04:13:34.112381 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b884bf3_5ea2_4aa6_b29f_ddb49c3f94b1.slice/crio-68744753cc7dcda69b638db3f113d93fac3f5e8ca377842ca62c13217a2d1077 WatchSource:0}: Error finding container 68744753cc7dcda69b638db3f113d93fac3f5e8ca377842ca62c13217a2d1077: Status 404 returned error can't find the container with id 68744753cc7dcda69b638db3f113d93fac3f5e8ca377842ca62c13217a2d1077 Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.134008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f44781c6-75de-479b-b6bb-33bc27d468fa" (UID: "f44781c6-75de-479b-b6bb-33bc27d468fa"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.149134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f44781c6-75de-479b-b6bb-33bc27d468fa" (UID: "f44781c6-75de-479b-b6bb-33bc27d468fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.173012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f44781c6-75de-479b-b6bb-33bc27d468fa" (UID: "f44781c6-75de-479b-b6bb-33bc27d468fa"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.189170 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbc25\" (UniqueName: \"kubernetes.io/projected/f44781c6-75de-479b-b6bb-33bc27d468fa-kube-api-access-zbc25\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.189215 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.189226 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.189242 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f44781c6-75de-479b-b6bb-33bc27d468fa-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.858863 4707 generic.go:334] "Generic (PLEG): container finished" podID="5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1" containerID="c48b6e2824a85ef22836018a375a237791d888983bf2b4628081cd0e862c3a85" exitCode=0 Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.858983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-c6cd-account-create-update-68v6b" event={"ID":"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1","Type":"ContainerDied","Data":"c48b6e2824a85ef22836018a375a237791d888983bf2b4628081cd0e862c3a85"} Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.859070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-c6cd-account-create-update-68v6b" event={"ID":"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1","Type":"ContainerStarted","Data":"68744753cc7dcda69b638db3f113d93fac3f5e8ca377842ca62c13217a2d1077"} Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.870392 4707 scope.go:117] "RemoveContainer" containerID="0932b0ac3dc2b05af0e58c966c543784158e63ace8b7014e5b797132c779395b" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.870730 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.888211 4707 generic.go:334] "Generic (PLEG): container finished" podID="7d200ad2-aa0f-468d-92ac-0563af93b582" containerID="5e866acd235ea292f5d705af25827b8dab3a3533aead98e43b94854731d335bc" exitCode=0 Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.890165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-zm8bg" event={"ID":"7d200ad2-aa0f-468d-92ac-0563af93b582","Type":"ContainerDied","Data":"5e866acd235ea292f5d705af25827b8dab3a3533aead98e43b94854731d335bc"} Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.890203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-zm8bg" event={"ID":"7d200ad2-aa0f-468d-92ac-0563af93b582","Type":"ContainerStarted","Data":"f1327a3313f52a59c1748823702bda8de9879ca98cb042572027eb5f50c793ff"} Jan 29 04:13:34 crc kubenswrapper[4707]: I0129 04:13:34.912303 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f44781c6-75de-479b-b6bb-33bc27d468fa" 
podUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" Jan 29 04:13:35 crc kubenswrapper[4707]: I0129 04:13:35.256418 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f44781c6-75de-479b-b6bb-33bc27d468fa" path="/var/lib/kubelet/pods/f44781c6-75de-479b-b6bb-33bc27d468fa/volumes" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.384567 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.476587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr7sk\" (UniqueName: \"kubernetes.io/projected/7d200ad2-aa0f-468d-92ac-0563af93b582-kube-api-access-nr7sk\") pod \"7d200ad2-aa0f-468d-92ac-0563af93b582\" (UID: \"7d200ad2-aa0f-468d-92ac-0563af93b582\") " Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.476715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d200ad2-aa0f-468d-92ac-0563af93b582-operator-scripts\") pod \"7d200ad2-aa0f-468d-92ac-0563af93b582\" (UID: \"7d200ad2-aa0f-468d-92ac-0563af93b582\") " Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.477280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d200ad2-aa0f-468d-92ac-0563af93b582-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d200ad2-aa0f-468d-92ac-0563af93b582" (UID: "7d200ad2-aa0f-468d-92ac-0563af93b582"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.483294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d200ad2-aa0f-468d-92ac-0563af93b582-kube-api-access-nr7sk" (OuterVolumeSpecName: "kube-api-access-nr7sk") pod "7d200ad2-aa0f-468d-92ac-0563af93b582" (UID: "7d200ad2-aa0f-468d-92ac-0563af93b582"). InnerVolumeSpecName "kube-api-access-nr7sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.559229 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.580353 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqvwp\" (UniqueName: \"kubernetes.io/projected/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-kube-api-access-jqvwp\") pod \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\" (UID: \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\") " Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.580604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-operator-scripts\") pod \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\" (UID: \"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1\") " Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.581078 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr7sk\" (UniqueName: \"kubernetes.io/projected/7d200ad2-aa0f-468d-92ac-0563af93b582-kube-api-access-nr7sk\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.581103 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d200ad2-aa0f-468d-92ac-0563af93b582-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.581614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1" (UID: "5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.588753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-kube-api-access-jqvwp" (OuterVolumeSpecName: "kube-api-access-jqvwp") pod "5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1" (UID: "5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1"). InnerVolumeSpecName "kube-api-access-jqvwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.683420 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.683461 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqvwp\" (UniqueName: \"kubernetes.io/projected/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1-kube-api-access-jqvwp\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.913033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-c6cd-account-create-update-68v6b" event={"ID":"5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1","Type":"ContainerDied","Data":"68744753cc7dcda69b638db3f113d93fac3f5e8ca377842ca62c13217a2d1077"} Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.913532 4707 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="68744753cc7dcda69b638db3f113d93fac3f5e8ca377842ca62c13217a2d1077" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.913197 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-c6cd-account-create-update-68v6b" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.917364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-zm8bg" event={"ID":"7d200ad2-aa0f-468d-92ac-0563af93b582","Type":"ContainerDied","Data":"f1327a3313f52a59c1748823702bda8de9879ca98cb042572027eb5f50c793ff"} Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.917473 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1327a3313f52a59c1748823702bda8de9879ca98cb042572027eb5f50c793ff" Jan 29 04:13:36 crc kubenswrapper[4707]: I0129 04:13:36.917604 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-zm8bg" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.435044 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-m4nxz"] Jan 29 04:13:38 crc kubenswrapper[4707]: E0129 04:13:38.436002 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d200ad2-aa0f-468d-92ac-0563af93b582" containerName="mariadb-database-create" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.436022 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d200ad2-aa0f-468d-92ac-0563af93b582" containerName="mariadb-database-create" Jan 29 04:13:38 crc kubenswrapper[4707]: E0129 04:13:38.436039 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1" containerName="mariadb-account-create-update" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.436048 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1" containerName="mariadb-account-create-update" Jan 29 04:13:38 crc 
kubenswrapper[4707]: I0129 04:13:38.436270 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1" containerName="mariadb-account-create-update" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.436302 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d200ad2-aa0f-468d-92ac-0563af93b582" containerName="mariadb-database-create" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.437075 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.445907 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.446269 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-8fk7t" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.446428 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.446608 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.454658 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-m4nxz"] Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.544299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-scripts\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.544592 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq28f\" (UniqueName: 
\"kubernetes.io/projected/b75fe528-48a1-41ae-af17-80199293c062-kube-api-access-zq28f\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.544655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-combined-ca-bundle\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.544760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-config-data\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.647155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-scripts\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.647469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq28f\" (UniqueName: \"kubernetes.io/projected/b75fe528-48a1-41ae-af17-80199293c062-kube-api-access-zq28f\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.647597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-combined-ca-bundle\") pod \"aodh-db-sync-m4nxz\" 
(UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.647689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-config-data\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.655709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-combined-ca-bundle\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.656724 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-config-data\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.659041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-scripts\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.675712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq28f\" (UniqueName: \"kubernetes.io/projected/b75fe528-48a1-41ae-af17-80199293c062-kube-api-access-zq28f\") pod \"aodh-db-sync-m4nxz\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:38 crc kubenswrapper[4707]: I0129 04:13:38.765553 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:39 crc kubenswrapper[4707]: I0129 04:13:39.304275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-m4nxz"] Jan 29 04:13:39 crc kubenswrapper[4707]: I0129 04:13:39.959959 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-m4nxz" event={"ID":"b75fe528-48a1-41ae-af17-80199293c062","Type":"ContainerStarted","Data":"9a6efc1f1b92c3a3104aba6d4c9731921fd4b2149c9c884ba6c16f211dd7d953"} Jan 29 04:13:45 crc kubenswrapper[4707]: I0129 04:13:45.020779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-m4nxz" event={"ID":"b75fe528-48a1-41ae-af17-80199293c062","Type":"ContainerStarted","Data":"51fe567c741545a7aa2afebc71201117bf1762413d32dc0db760f8c31f44fc27"} Jan 29 04:13:47 crc kubenswrapper[4707]: I0129 04:13:47.037389 4707 generic.go:334] "Generic (PLEG): container finished" podID="b75fe528-48a1-41ae-af17-80199293c062" containerID="51fe567c741545a7aa2afebc71201117bf1762413d32dc0db760f8c31f44fc27" exitCode=0 Jan 29 04:13:47 crc kubenswrapper[4707]: I0129 04:13:47.037478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-m4nxz" event={"ID":"b75fe528-48a1-41ae-af17-80199293c062","Type":"ContainerDied","Data":"51fe567c741545a7aa2afebc71201117bf1762413d32dc0db760f8c31f44fc27"} Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.516773 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.607532 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-combined-ca-bundle\") pod \"b75fe528-48a1-41ae-af17-80199293c062\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.607777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-scripts\") pod \"b75fe528-48a1-41ae-af17-80199293c062\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.607924 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq28f\" (UniqueName: \"kubernetes.io/projected/b75fe528-48a1-41ae-af17-80199293c062-kube-api-access-zq28f\") pod \"b75fe528-48a1-41ae-af17-80199293c062\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.607996 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-config-data\") pod \"b75fe528-48a1-41ae-af17-80199293c062\" (UID: \"b75fe528-48a1-41ae-af17-80199293c062\") " Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.616019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b75fe528-48a1-41ae-af17-80199293c062-kube-api-access-zq28f" (OuterVolumeSpecName: "kube-api-access-zq28f") pod "b75fe528-48a1-41ae-af17-80199293c062" (UID: "b75fe528-48a1-41ae-af17-80199293c062"). InnerVolumeSpecName "kube-api-access-zq28f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.616741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-scripts" (OuterVolumeSpecName: "scripts") pod "b75fe528-48a1-41ae-af17-80199293c062" (UID: "b75fe528-48a1-41ae-af17-80199293c062"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.643489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-config-data" (OuterVolumeSpecName: "config-data") pod "b75fe528-48a1-41ae-af17-80199293c062" (UID: "b75fe528-48a1-41ae-af17-80199293c062"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.644557 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b75fe528-48a1-41ae-af17-80199293c062" (UID: "b75fe528-48a1-41ae-af17-80199293c062"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.710751 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.710781 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.710790 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq28f\" (UniqueName: \"kubernetes.io/projected/b75fe528-48a1-41ae-af17-80199293c062-kube-api-access-zq28f\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:48 crc kubenswrapper[4707]: I0129 04:13:48.710799 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75fe528-48a1-41ae-af17-80199293c062-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:49 crc kubenswrapper[4707]: I0129 04:13:49.076695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-m4nxz" event={"ID":"b75fe528-48a1-41ae-af17-80199293c062","Type":"ContainerDied","Data":"9a6efc1f1b92c3a3104aba6d4c9731921fd4b2149c9c884ba6c16f211dd7d953"} Jan 29 04:13:49 crc kubenswrapper[4707]: I0129 04:13:49.077067 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a6efc1f1b92c3a3104aba6d4c9731921fd4b2149c9c884ba6c16f211dd7d953" Jan 29 04:13:49 crc kubenswrapper[4707]: I0129 04:13:49.076770 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-m4nxz" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.261698 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 29 04:13:53 crc kubenswrapper[4707]: E0129 04:13:53.262907 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75fe528-48a1-41ae-af17-80199293c062" containerName="aodh-db-sync" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.262924 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75fe528-48a1-41ae-af17-80199293c062" containerName="aodh-db-sync" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.263202 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75fe528-48a1-41ae-af17-80199293c062" containerName="aodh-db-sync" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.265514 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.265858 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.272234 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.276723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-8fk7t" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.277116 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.368485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-scripts\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.368611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.368734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t69k4\" (UniqueName: \"kubernetes.io/projected/9118c663-76c2-40bf-b6c0-d72d12d6dca1-kube-api-access-t69k4\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.368766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-config-data\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" 
Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.470673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.470840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t69k4\" (UniqueName: \"kubernetes.io/projected/9118c663-76c2-40bf-b6c0-d72d12d6dca1-kube-api-access-t69k4\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.470879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-config-data\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.470955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-scripts\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.478675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.480649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-scripts\") pod \"aodh-0\" (UID: 
\"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.482660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-config-data\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.490072 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t69k4\" (UniqueName: \"kubernetes.io/projected/9118c663-76c2-40bf-b6c0-d72d12d6dca1-kube-api-access-t69k4\") pod \"aodh-0\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " pod="openstack/aodh-0" Jan 29 04:13:53 crc kubenswrapper[4707]: I0129 04:13:53.633233 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 29 04:13:54 crc kubenswrapper[4707]: I0129 04:13:54.137085 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:13:54 crc kubenswrapper[4707]: W0129 04:13:54.139307 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9118c663_76c2_40bf_b6c0_d72d12d6dca1.slice/crio-1b7b1692b8afa5f66e39d8cab2e6eda92c58fa35d661c1d331153c81e7a43462 WatchSource:0}: Error finding container 1b7b1692b8afa5f66e39d8cab2e6eda92c58fa35d661c1d331153c81e7a43462: Status 404 returned error can't find the container with id 1b7b1692b8afa5f66e39d8cab2e6eda92c58fa35d661c1d331153c81e7a43462 Jan 29 04:13:55 crc kubenswrapper[4707]: I0129 04:13:55.146079 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerStarted","Data":"075cc51254ac7c32ac05d354a2cafb5d2ec7ff354c438b5ed6e04d04600fa134"} Jan 29 04:13:55 crc kubenswrapper[4707]: I0129 04:13:55.147074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerStarted","Data":"1b7b1692b8afa5f66e39d8cab2e6eda92c58fa35d661c1d331153c81e7a43462"} Jan 29 04:13:55 crc kubenswrapper[4707]: I0129 04:13:55.539335 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 04:13:55 crc kubenswrapper[4707]: I0129 04:13:55.539605 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="ceilometer-central-agent" containerID="cri-o://4d0e2d9b1eec9cc61f9f0651e523a9e4d75212d47403824ecd3e4321067f63bc" gracePeriod=30 Jan 29 04:13:55 crc kubenswrapper[4707]: I0129 04:13:55.540078 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="proxy-httpd" containerID="cri-o://3365cdbdc6e7a511395c02dd233d4de53a208ff117c4d707c42d3390af860ec2" gracePeriod=30 Jan 29 04:13:55 crc kubenswrapper[4707]: I0129 04:13:55.540133 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="sg-core" containerID="cri-o://4fcf6d6105ffe2ba61f0f3d63506f3c694c6c6b4f7d0dc7f810006808e80f883" gracePeriod=30 Jan 29 04:13:55 crc kubenswrapper[4707]: I0129 04:13:55.540165 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="ceilometer-notification-agent" containerID="cri-o://e6f50d4dacd63a73e289ee94fe2585ec411ded08580d9d073e076fae45c8692b" gracePeriod=30 Jan 29 04:13:55 crc kubenswrapper[4707]: E0129 04:13:55.826637 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7470b455_8eb5_43ed_85cd_ad132974a76e.slice/crio-conmon-4fcf6d6105ffe2ba61f0f3d63506f3c694c6c6b4f7d0dc7f810006808e80f883.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7470b455_8eb5_43ed_85cd_ad132974a76e.slice/crio-4fcf6d6105ffe2ba61f0f3d63506f3c694c6c6b4f7d0dc7f810006808e80f883.scope\": RecentStats: unable to find data in memory cache]" Jan 29 04:13:56 crc kubenswrapper[4707]: I0129 04:13:56.169591 4707 generic.go:334] "Generic (PLEG): container finished" podID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerID="3365cdbdc6e7a511395c02dd233d4de53a208ff117c4d707c42d3390af860ec2" exitCode=0 Jan 29 04:13:56 crc kubenswrapper[4707]: I0129 04:13:56.169643 4707 generic.go:334] "Generic (PLEG): container finished" podID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerID="4fcf6d6105ffe2ba61f0f3d63506f3c694c6c6b4f7d0dc7f810006808e80f883" exitCode=2 Jan 29 04:13:56 crc kubenswrapper[4707]: I0129 04:13:56.169651 4707 generic.go:334] "Generic (PLEG): container finished" podID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerID="4d0e2d9b1eec9cc61f9f0651e523a9e4d75212d47403824ecd3e4321067f63bc" exitCode=0 Jan 29 04:13:56 crc kubenswrapper[4707]: I0129 04:13:56.169743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerDied","Data":"3365cdbdc6e7a511395c02dd233d4de53a208ff117c4d707c42d3390af860ec2"} Jan 29 04:13:56 crc kubenswrapper[4707]: I0129 04:13:56.169770 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerDied","Data":"4fcf6d6105ffe2ba61f0f3d63506f3c694c6c6b4f7d0dc7f810006808e80f883"} Jan 29 04:13:56 crc kubenswrapper[4707]: I0129 04:13:56.169781 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerDied","Data":"4d0e2d9b1eec9cc61f9f0651e523a9e4d75212d47403824ecd3e4321067f63bc"} Jan 29 04:13:56 crc kubenswrapper[4707]: I0129 04:13:56.639011 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.192311 4707 generic.go:334] "Generic (PLEG): container finished" podID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerID="e6f50d4dacd63a73e289ee94fe2585ec411ded08580d9d073e076fae45c8692b" exitCode=0 Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.192954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerDied","Data":"e6f50d4dacd63a73e289ee94fe2585ec411ded08580d9d073e076fae45c8692b"} Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.196397 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerStarted","Data":"880cd035c4c9f5c897c9e4732f15afafb230b08d45ce385d0175f754dfa2fd3f"} Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.477388 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.575595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-492hd\" (UniqueName: \"kubernetes.io/projected/7470b455-8eb5-43ed-85cd-ad132974a76e-kube-api-access-492hd\") pod \"7470b455-8eb5-43ed-85cd-ad132974a76e\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.575726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-sg-core-conf-yaml\") pod \"7470b455-8eb5-43ed-85cd-ad132974a76e\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.575805 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-combined-ca-bundle\") pod \"7470b455-8eb5-43ed-85cd-ad132974a76e\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.575844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-config-data\") pod \"7470b455-8eb5-43ed-85cd-ad132974a76e\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.575867 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-scripts\") pod \"7470b455-8eb5-43ed-85cd-ad132974a76e\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.575893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-log-httpd\") pod \"7470b455-8eb5-43ed-85cd-ad132974a76e\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.575914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-ceilometer-tls-certs\") pod \"7470b455-8eb5-43ed-85cd-ad132974a76e\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.575941 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-run-httpd\") pod \"7470b455-8eb5-43ed-85cd-ad132974a76e\" (UID: \"7470b455-8eb5-43ed-85cd-ad132974a76e\") " Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.576951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7470b455-8eb5-43ed-85cd-ad132974a76e" (UID: "7470b455-8eb5-43ed-85cd-ad132974a76e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.577287 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7470b455-8eb5-43ed-85cd-ad132974a76e" (UID: "7470b455-8eb5-43ed-85cd-ad132974a76e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.584660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-scripts" (OuterVolumeSpecName: "scripts") pod "7470b455-8eb5-43ed-85cd-ad132974a76e" (UID: "7470b455-8eb5-43ed-85cd-ad132974a76e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.584790 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7470b455-8eb5-43ed-85cd-ad132974a76e-kube-api-access-492hd" (OuterVolumeSpecName: "kube-api-access-492hd") pod "7470b455-8eb5-43ed-85cd-ad132974a76e" (UID: "7470b455-8eb5-43ed-85cd-ad132974a76e"). InnerVolumeSpecName "kube-api-access-492hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.631447 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7470b455-8eb5-43ed-85cd-ad132974a76e" (UID: "7470b455-8eb5-43ed-85cd-ad132974a76e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.673764 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7470b455-8eb5-43ed-85cd-ad132974a76e" (UID: "7470b455-8eb5-43ed-85cd-ad132974a76e"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.679916 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-492hd\" (UniqueName: \"kubernetes.io/projected/7470b455-8eb5-43ed-85cd-ad132974a76e-kube-api-access-492hd\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.679951 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.679968 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.679981 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.679993 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.680001 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7470b455-8eb5-43ed-85cd-ad132974a76e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.705745 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7470b455-8eb5-43ed-85cd-ad132974a76e" (UID: 
"7470b455-8eb5-43ed-85cd-ad132974a76e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.727730 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-config-data" (OuterVolumeSpecName: "config-data") pod "7470b455-8eb5-43ed-85cd-ad132974a76e" (UID: "7470b455-8eb5-43ed-85cd-ad132974a76e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.782227 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:57 crc kubenswrapper[4707]: I0129 04:13:57.782259 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7470b455-8eb5-43ed-85cd-ad132974a76e-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.212628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7470b455-8eb5-43ed-85cd-ad132974a76e","Type":"ContainerDied","Data":"5a7a7ea86c9065ec8deda5836beaffecd06d8afc4a2c61ebbaf6f7288ccacf10"} Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.213080 4707 scope.go:117] "RemoveContainer" containerID="3365cdbdc6e7a511395c02dd233d4de53a208ff117c4d707c42d3390af860ec2" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.213274 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.266058 4707 scope.go:117] "RemoveContainer" containerID="4fcf6d6105ffe2ba61f0f3d63506f3c694c6c6b4f7d0dc7f810006808e80f883" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.304213 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.306775 4707 scope.go:117] "RemoveContainer" containerID="e6f50d4dacd63a73e289ee94fe2585ec411ded08580d9d073e076fae45c8692b" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.339139 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.346869 4707 scope.go:117] "RemoveContainer" containerID="4d0e2d9b1eec9cc61f9f0651e523a9e4d75212d47403824ecd3e4321067f63bc" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.351201 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 04:13:58 crc kubenswrapper[4707]: E0129 04:13:58.351833 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="ceilometer-central-agent" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.351853 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="ceilometer-central-agent" Jan 29 04:13:58 crc kubenswrapper[4707]: E0129 04:13:58.351865 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="sg-core" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.351934 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="sg-core" Jan 29 04:13:58 crc kubenswrapper[4707]: E0129 04:13:58.351960 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="ceilometer-notification-agent" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.351967 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="ceilometer-notification-agent" Jan 29 04:13:58 crc kubenswrapper[4707]: E0129 04:13:58.351989 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="proxy-httpd" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.351995 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="proxy-httpd" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.352248 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="ceilometer-central-agent" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.352263 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="sg-core" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.352294 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="proxy-httpd" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.352307 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" containerName="ceilometer-notification-agent" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.354513 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.356857 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.357371 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.357670 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.365716 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.406519 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c20a992-d535-46ad-9cc4-f2348c18f7ca-log-httpd\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.406624 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.406663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c20a992-d535-46ad-9cc4-f2348c18f7ca-run-httpd\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.406767 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c2gnr\" (UniqueName: \"kubernetes.io/projected/7c20a992-d535-46ad-9cc4-f2348c18f7ca-kube-api-access-c2gnr\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.407008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.407042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-scripts\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.407065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.407158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-config-data\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.509761 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c20a992-d535-46ad-9cc4-f2348c18f7ca-run-httpd\") pod \"ceilometer-0\" (UID: 
\"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.510407 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gnr\" (UniqueName: \"kubernetes.io/projected/7c20a992-d535-46ad-9cc4-f2348c18f7ca-kube-api-access-c2gnr\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.510585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.510704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-scripts\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.510938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.511620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-config-data\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.510591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c20a992-d535-46ad-9cc4-f2348c18f7ca-run-httpd\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.511860 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c20a992-d535-46ad-9cc4-f2348c18f7ca-log-httpd\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.511965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.512194 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c20a992-d535-46ad-9cc4-f2348c18f7ca-log-httpd\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.515781 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-scripts\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.516341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 
04:13:58.516407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-config-data\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.519189 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.519518 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c20a992-d535-46ad-9cc4-f2348c18f7ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.531033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gnr\" (UniqueName: \"kubernetes.io/projected/7c20a992-d535-46ad-9cc4-f2348c18f7ca-kube-api-access-c2gnr\") pod \"ceilometer-0\" (UID: \"7c20a992-d535-46ad-9cc4-f2348c18f7ca\") " pod="openstack/ceilometer-0" Jan 29 04:13:58 crc kubenswrapper[4707]: I0129 04:13:58.702319 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 04:13:59 crc kubenswrapper[4707]: I0129 04:13:59.204261 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 04:13:59 crc kubenswrapper[4707]: I0129 04:13:59.275915 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7470b455-8eb5-43ed-85cd-ad132974a76e" path="/var/lib/kubelet/pods/7470b455-8eb5-43ed-85cd-ad132974a76e/volumes" Jan 29 04:13:59 crc kubenswrapper[4707]: I0129 04:13:59.277030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c20a992-d535-46ad-9cc4-f2348c18f7ca","Type":"ContainerStarted","Data":"e79dbb4a626a042811f5b76aceb9a2959e9505ba45e22bcfc533e37cd19fba7a"} Jan 29 04:13:59 crc kubenswrapper[4707]: I0129 04:13:59.277070 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerStarted","Data":"aa7a834f4604e4c08243fedba3960346fbcefaaf36952fbe23f5a9d511cc9915"} Jan 29 04:14:00 crc kubenswrapper[4707]: I0129 04:14:00.283617 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c20a992-d535-46ad-9cc4-f2348c18f7ca","Type":"ContainerStarted","Data":"c527ae5082cf58199aed45518c6ff3a6372986d6662af30d535c32312dcf2a44"} Jan 29 04:14:00 crc kubenswrapper[4707]: I0129 04:14:00.286461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerStarted","Data":"5c54d45b4f9f23f0903408461c273b58b53221372e63c415464039f5821180b0"} Jan 29 04:14:00 crc kubenswrapper[4707]: I0129 04:14:00.286748 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-evaluator" containerID="cri-o://880cd035c4c9f5c897c9e4732f15afafb230b08d45ce385d0175f754dfa2fd3f" gracePeriod=30 Jan 29 04:14:00 crc 
kubenswrapper[4707]: I0129 04:14:00.286710 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-api" containerID="cri-o://075cc51254ac7c32ac05d354a2cafb5d2ec7ff354c438b5ed6e04d04600fa134" gracePeriod=30 Jan 29 04:14:00 crc kubenswrapper[4707]: I0129 04:14:00.286749 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-notifier" containerID="cri-o://aa7a834f4604e4c08243fedba3960346fbcefaaf36952fbe23f5a9d511cc9915" gracePeriod=30 Jan 29 04:14:00 crc kubenswrapper[4707]: I0129 04:14:00.286764 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-listener" containerID="cri-o://5c54d45b4f9f23f0903408461c273b58b53221372e63c415464039f5821180b0" gracePeriod=30 Jan 29 04:14:00 crc kubenswrapper[4707]: I0129 04:14:00.313876 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.695548965 podStartE2EDuration="7.313854685s" podCreationTimestamp="2026-01-29 04:13:53 +0000 UTC" firstStartedPulling="2026-01-29 04:13:54.143440437 +0000 UTC m=+2787.627669342" lastFinishedPulling="2026-01-29 04:13:59.761746147 +0000 UTC m=+2793.245975062" observedRunningTime="2026-01-29 04:14:00.309578446 +0000 UTC m=+2793.793807351" watchObservedRunningTime="2026-01-29 04:14:00.313854685 +0000 UTC m=+2793.798083590" Jan 29 04:14:01 crc kubenswrapper[4707]: I0129 04:14:01.317679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c20a992-d535-46ad-9cc4-f2348c18f7ca","Type":"ContainerStarted","Data":"4b36545c060fcf0ab37dd83fda0bd1cf1f53cba61b1d705bd5104975310d7a3e"} Jan 29 04:14:01 crc kubenswrapper[4707]: I0129 04:14:01.323197 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerID="aa7a834f4604e4c08243fedba3960346fbcefaaf36952fbe23f5a9d511cc9915" exitCode=0 Jan 29 04:14:01 crc kubenswrapper[4707]: I0129 04:14:01.323245 4707 generic.go:334] "Generic (PLEG): container finished" podID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerID="880cd035c4c9f5c897c9e4732f15afafb230b08d45ce385d0175f754dfa2fd3f" exitCode=0 Jan 29 04:14:01 crc kubenswrapper[4707]: I0129 04:14:01.323262 4707 generic.go:334] "Generic (PLEG): container finished" podID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerID="075cc51254ac7c32ac05d354a2cafb5d2ec7ff354c438b5ed6e04d04600fa134" exitCode=0 Jan 29 04:14:01 crc kubenswrapper[4707]: I0129 04:14:01.323295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerDied","Data":"aa7a834f4604e4c08243fedba3960346fbcefaaf36952fbe23f5a9d511cc9915"} Jan 29 04:14:01 crc kubenswrapper[4707]: I0129 04:14:01.323331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerDied","Data":"880cd035c4c9f5c897c9e4732f15afafb230b08d45ce385d0175f754dfa2fd3f"} Jan 29 04:14:01 crc kubenswrapper[4707]: I0129 04:14:01.323347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerDied","Data":"075cc51254ac7c32ac05d354a2cafb5d2ec7ff354c438b5ed6e04d04600fa134"} Jan 29 04:14:02 crc kubenswrapper[4707]: I0129 04:14:02.340888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c20a992-d535-46ad-9cc4-f2348c18f7ca","Type":"ContainerStarted","Data":"9084af5a685a71289054ba420394cfe965f3fb7b7da154ee0a71e6bfb42a821c"} Jan 29 04:14:04 crc kubenswrapper[4707]: I0129 04:14:04.364215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7c20a992-d535-46ad-9cc4-f2348c18f7ca","Type":"ContainerStarted","Data":"09e9270ada5205d57866d932186cc2f9938721fc1c3ce3fa43ef4c54f78ff65d"} Jan 29 04:14:04 crc kubenswrapper[4707]: I0129 04:14:04.365185 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 04:14:04 crc kubenswrapper[4707]: I0129 04:14:04.410511 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.451307191 podStartE2EDuration="6.41048294s" podCreationTimestamp="2026-01-29 04:13:58 +0000 UTC" firstStartedPulling="2026-01-29 04:13:59.2121584 +0000 UTC m=+2792.696387315" lastFinishedPulling="2026-01-29 04:14:03.171334159 +0000 UTC m=+2796.655563064" observedRunningTime="2026-01-29 04:14:04.400404889 +0000 UTC m=+2797.884633804" watchObservedRunningTime="2026-01-29 04:14:04.41048294 +0000 UTC m=+2797.894711855" Jan 29 04:14:28 crc kubenswrapper[4707]: I0129 04:14:28.714830 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.784360 4707 generic.go:334] "Generic (PLEG): container finished" podID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerID="5c54d45b4f9f23f0903408461c273b58b53221372e63c415464039f5821180b0" exitCode=137 Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.784509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerDied","Data":"5c54d45b4f9f23f0903408461c273b58b53221372e63c415464039f5821180b0"} Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.784879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9118c663-76c2-40bf-b6c0-d72d12d6dca1","Type":"ContainerDied","Data":"1b7b1692b8afa5f66e39d8cab2e6eda92c58fa35d661c1d331153c81e7a43462"} Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.784898 4707 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b7b1692b8afa5f66e39d8cab2e6eda92c58fa35d661c1d331153c81e7a43462" Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.785019 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.953260 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-config-data\") pod \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.953511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-scripts\") pod \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.953767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t69k4\" (UniqueName: \"kubernetes.io/projected/9118c663-76c2-40bf-b6c0-d72d12d6dca1-kube-api-access-t69k4\") pod \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.955375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-combined-ca-bundle\") pod \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\" (UID: \"9118c663-76c2-40bf-b6c0-d72d12d6dca1\") " Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.972854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-scripts" (OuterVolumeSpecName: "scripts") pod 
"9118c663-76c2-40bf-b6c0-d72d12d6dca1" (UID: "9118c663-76c2-40bf-b6c0-d72d12d6dca1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:14:30 crc kubenswrapper[4707]: I0129 04:14:30.979831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9118c663-76c2-40bf-b6c0-d72d12d6dca1-kube-api-access-t69k4" (OuterVolumeSpecName: "kube-api-access-t69k4") pod "9118c663-76c2-40bf-b6c0-d72d12d6dca1" (UID: "9118c663-76c2-40bf-b6c0-d72d12d6dca1"). InnerVolumeSpecName "kube-api-access-t69k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.061143 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.061490 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t69k4\" (UniqueName: \"kubernetes.io/projected/9118c663-76c2-40bf-b6c0-d72d12d6dca1-kube-api-access-t69k4\") on node \"crc\" DevicePath \"\"" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.070794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9118c663-76c2-40bf-b6c0-d72d12d6dca1" (UID: "9118c663-76c2-40bf-b6c0-d72d12d6dca1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.089818 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-config-data" (OuterVolumeSpecName: "config-data") pod "9118c663-76c2-40bf-b6c0-d72d12d6dca1" (UID: "9118c663-76c2-40bf-b6c0-d72d12d6dca1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.163309 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.163349 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118c663-76c2-40bf-b6c0-d72d12d6dca1-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.794198 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.824520 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.840608 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.859785 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 29 04:14:31 crc kubenswrapper[4707]: E0129 04:14:31.860176 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-notifier" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.860197 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-notifier" Jan 29 04:14:31 crc kubenswrapper[4707]: E0129 04:14:31.860220 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-evaluator" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.860229 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-evaluator" Jan 29 04:14:31 crc 
kubenswrapper[4707]: E0129 04:14:31.860246 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-listener" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.860254 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-listener" Jan 29 04:14:31 crc kubenswrapper[4707]: E0129 04:14:31.860268 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-api" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.860273 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-api" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.860474 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-api" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.860499 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-listener" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.860510 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-notifier" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.860520 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" containerName="aodh-evaluator" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.862123 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.865304 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.865453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.866024 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-8fk7t" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.866145 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.867589 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.900593 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.981254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-internal-tls-certs\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.981400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9594\" (UniqueName: \"kubernetes.io/projected/bccbabd7-0455-417c-865c-757fa8f011f0-kube-api-access-j9594\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.981423 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-public-tls-certs\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.981448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.981497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-scripts\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:31 crc kubenswrapper[4707]: I0129 04:14:31.981612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-config-data\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.083511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9594\" (UniqueName: \"kubernetes.io/projected/bccbabd7-0455-417c-865c-757fa8f011f0-kube-api-access-j9594\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.083591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-public-tls-certs\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 
04:14:32.083635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.083702 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-scripts\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.083735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-config-data\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.083766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-internal-tls-certs\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.089233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.089523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-scripts\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc 
kubenswrapper[4707]: I0129 04:14:32.091873 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-config-data\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.092787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-public-tls-certs\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.096425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-internal-tls-certs\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.119387 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9594\" (UniqueName: \"kubernetes.io/projected/bccbabd7-0455-417c-865c-757fa8f011f0-kube-api-access-j9594\") pod \"aodh-0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") " pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.196826 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 29 04:14:32 crc kubenswrapper[4707]: W0129 04:14:32.842822 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbccbabd7_0455_417c_865c_757fa8f011f0.slice/crio-bd7ae5178739a4ed402d7ccf7e31a89ee3d282b619f0e53b87f1ea1b4d7652e5 WatchSource:0}: Error finding container bd7ae5178739a4ed402d7ccf7e31a89ee3d282b619f0e53b87f1ea1b4d7652e5: Status 404 returned error can't find the container with id bd7ae5178739a4ed402d7ccf7e31a89ee3d282b619f0e53b87f1ea1b4d7652e5 Jan 29 04:14:32 crc kubenswrapper[4707]: I0129 04:14:32.848999 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:14:33 crc kubenswrapper[4707]: I0129 04:14:33.259989 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9118c663-76c2-40bf-b6c0-d72d12d6dca1" path="/var/lib/kubelet/pods/9118c663-76c2-40bf-b6c0-d72d12d6dca1/volumes" Jan 29 04:14:33 crc kubenswrapper[4707]: I0129 04:14:33.814333 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerStarted","Data":"1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9"} Jan 29 04:14:33 crc kubenswrapper[4707]: I0129 04:14:33.814928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerStarted","Data":"bd7ae5178739a4ed402d7ccf7e31a89ee3d282b619f0e53b87f1ea1b4d7652e5"} Jan 29 04:14:34 crc kubenswrapper[4707]: I0129 04:14:34.829838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerStarted","Data":"7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651"} Jan 29 04:14:35 crc kubenswrapper[4707]: I0129 04:14:35.841672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerStarted","Data":"7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7"} Jan 29 04:14:35 crc kubenswrapper[4707]: I0129 04:14:35.842186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerStarted","Data":"1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087"} Jan 29 04:14:35 crc kubenswrapper[4707]: I0129 04:14:35.879610 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.371791066 podStartE2EDuration="4.879580361s" podCreationTimestamp="2026-01-29 04:14:31 +0000 UTC" firstStartedPulling="2026-01-29 04:14:32.844461089 +0000 UTC m=+2826.328689994" lastFinishedPulling="2026-01-29 04:14:35.352250374 +0000 UTC m=+2828.836479289" observedRunningTime="2026-01-29 04:14:35.877112112 +0000 UTC m=+2829.361341037" watchObservedRunningTime="2026-01-29 04:14:35.879580361 +0000 UTC m=+2829.363809286" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.158798 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5"] Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.161247 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.165581 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.166350 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.183522 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5"] Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.211251 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdtb\" (UniqueName: \"kubernetes.io/projected/89c119a1-197a-4413-a598-9eb61df0e072-kube-api-access-2pdtb\") pod \"collect-profiles-29494335-zzgz5\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.211573 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89c119a1-197a-4413-a598-9eb61df0e072-secret-volume\") pod \"collect-profiles-29494335-zzgz5\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.211665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89c119a1-197a-4413-a598-9eb61df0e072-config-volume\") pod \"collect-profiles-29494335-zzgz5\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.314405 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89c119a1-197a-4413-a598-9eb61df0e072-secret-volume\") pod \"collect-profiles-29494335-zzgz5\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.314518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89c119a1-197a-4413-a598-9eb61df0e072-config-volume\") pod \"collect-profiles-29494335-zzgz5\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.315990 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89c119a1-197a-4413-a598-9eb61df0e072-config-volume\") pod \"collect-profiles-29494335-zzgz5\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.316607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdtb\" (UniqueName: \"kubernetes.io/projected/89c119a1-197a-4413-a598-9eb61df0e072-kube-api-access-2pdtb\") pod \"collect-profiles-29494335-zzgz5\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.328938 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/89c119a1-197a-4413-a598-9eb61df0e072-secret-volume\") pod \"collect-profiles-29494335-zzgz5\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.349314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdtb\" (UniqueName: \"kubernetes.io/projected/89c119a1-197a-4413-a598-9eb61df0e072-kube-api-access-2pdtb\") pod \"collect-profiles-29494335-zzgz5\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.490818 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:00 crc kubenswrapper[4707]: I0129 04:15:00.994446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5"] Jan 29 04:15:01 crc kubenswrapper[4707]: I0129 04:15:01.117294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" event={"ID":"89c119a1-197a-4413-a598-9eb61df0e072","Type":"ContainerStarted","Data":"193d0293653f1d55f5e5e50b0663118f062870eafbca73114463ea73d45a4fdd"} Jan 29 04:15:02 crc kubenswrapper[4707]: I0129 04:15:02.134727 4707 generic.go:334] "Generic (PLEG): container finished" podID="89c119a1-197a-4413-a598-9eb61df0e072" containerID="327cb10d00731f997c106eb616725764864b14013d6276fa43fa084bd5725a88" exitCode=0 Jan 29 04:15:02 crc kubenswrapper[4707]: I0129 04:15:02.134909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" 
event={"ID":"89c119a1-197a-4413-a598-9eb61df0e072","Type":"ContainerDied","Data":"327cb10d00731f997c106eb616725764864b14013d6276fa43fa084bd5725a88"} Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.463815 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.464528 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.587562 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.721055 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89c119a1-197a-4413-a598-9eb61df0e072-config-volume\") pod \"89c119a1-197a-4413-a598-9eb61df0e072\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.721163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89c119a1-197a-4413-a598-9eb61df0e072-secret-volume\") pod \"89c119a1-197a-4413-a598-9eb61df0e072\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.721268 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pdtb\" (UniqueName: \"kubernetes.io/projected/89c119a1-197a-4413-a598-9eb61df0e072-kube-api-access-2pdtb\") pod \"89c119a1-197a-4413-a598-9eb61df0e072\" (UID: \"89c119a1-197a-4413-a598-9eb61df0e072\") " Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.722120 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c119a1-197a-4413-a598-9eb61df0e072-config-volume" (OuterVolumeSpecName: "config-volume") pod "89c119a1-197a-4413-a598-9eb61df0e072" (UID: "89c119a1-197a-4413-a598-9eb61df0e072"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.727416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c119a1-197a-4413-a598-9eb61df0e072-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "89c119a1-197a-4413-a598-9eb61df0e072" (UID: "89c119a1-197a-4413-a598-9eb61df0e072"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.728021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c119a1-197a-4413-a598-9eb61df0e072-kube-api-access-2pdtb" (OuterVolumeSpecName: "kube-api-access-2pdtb") pod "89c119a1-197a-4413-a598-9eb61df0e072" (UID: "89c119a1-197a-4413-a598-9eb61df0e072"). InnerVolumeSpecName "kube-api-access-2pdtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.823883 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89c119a1-197a-4413-a598-9eb61df0e072-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.823926 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pdtb\" (UniqueName: \"kubernetes.io/projected/89c119a1-197a-4413-a598-9eb61df0e072-kube-api-access-2pdtb\") on node \"crc\" DevicePath \"\"" Jan 29 04:15:03 crc kubenswrapper[4707]: I0129 04:15:03.823936 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89c119a1-197a-4413-a598-9eb61df0e072-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 04:15:04 crc kubenswrapper[4707]: I0129 04:15:04.161602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" event={"ID":"89c119a1-197a-4413-a598-9eb61df0e072","Type":"ContainerDied","Data":"193d0293653f1d55f5e5e50b0663118f062870eafbca73114463ea73d45a4fdd"} Jan 29 04:15:04 crc kubenswrapper[4707]: I0129 04:15:04.161660 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="193d0293653f1d55f5e5e50b0663118f062870eafbca73114463ea73d45a4fdd" Jan 29 04:15:04 crc kubenswrapper[4707]: I0129 04:15:04.162147 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494335-zzgz5" Jan 29 04:15:04 crc kubenswrapper[4707]: I0129 04:15:04.677975 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp"] Jan 29 04:15:04 crc kubenswrapper[4707]: I0129 04:15:04.689065 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494290-z58jp"] Jan 29 04:15:05 crc kubenswrapper[4707]: I0129 04:15:05.262738 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e997ce0-2a4c-4d1c-9191-4de1ab444f09" path="/var/lib/kubelet/pods/8e997ce0-2a4c-4d1c-9191-4de1ab444f09/volumes" Jan 29 04:15:33 crc kubenswrapper[4707]: I0129 04:15:33.463177 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:15:33 crc kubenswrapper[4707]: I0129 04:15:33.463814 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:15:40 crc kubenswrapper[4707]: I0129 04:15:40.236009 4707 scope.go:117] "RemoveContainer" containerID="6f377662679e8bc9b2f0eab54454145e3141084ebda1460ccef9a3576e7b597f" Jan 29 04:16:00 crc kubenswrapper[4707]: I0129 04:16:00.457664 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-67b9cbc75f-dv5cr" podUID="57f35f5f-1517-41b4-b354-59fd90d8fea5" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 29 04:16:03 crc 
kubenswrapper[4707]: I0129 04:16:03.463411 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:16:03 crc kubenswrapper[4707]: I0129 04:16:03.465235 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:16:03 crc kubenswrapper[4707]: I0129 04:16:03.465466 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 04:16:03 crc kubenswrapper[4707]: I0129 04:16:03.467526 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 04:16:03 crc kubenswrapper[4707]: I0129 04:16:03.467960 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" gracePeriod=600 Jan 29 04:16:03 crc kubenswrapper[4707]: E0129 04:16:03.652996 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:16:03 crc kubenswrapper[4707]: I0129 04:16:03.822620 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" exitCode=0 Jan 29 04:16:03 crc kubenswrapper[4707]: I0129 04:16:03.822709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7"} Jan 29 04:16:03 crc kubenswrapper[4707]: I0129 04:16:03.822800 4707 scope.go:117] "RemoveContainer" containerID="fbe00dcfbe047d7c82d09303891b3a5c6b17dc638addfcf5786780c3073d8600" Jan 29 04:16:03 crc kubenswrapper[4707]: I0129 04:16:03.823817 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:16:03 crc kubenswrapper[4707]: E0129 04:16:03.824370 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:16:15 crc kubenswrapper[4707]: I0129 04:16:15.245227 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:16:15 crc kubenswrapper[4707]: E0129 04:16:15.247215 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:16:29 crc kubenswrapper[4707]: I0129 04:16:29.246637 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:16:29 crc kubenswrapper[4707]: E0129 04:16:29.247880 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:16:40 crc kubenswrapper[4707]: I0129 04:16:40.244002 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:16:40 crc kubenswrapper[4707]: E0129 04:16:40.245034 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:16:53 crc kubenswrapper[4707]: I0129 04:16:53.244165 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:16:53 crc kubenswrapper[4707]: E0129 04:16:53.245448 4707 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.109674 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n6x69"] Jan 29 04:17:03 crc kubenswrapper[4707]: E0129 04:17:03.111738 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c119a1-197a-4413-a598-9eb61df0e072" containerName="collect-profiles" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.111777 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c119a1-197a-4413-a598-9eb61df0e072" containerName="collect-profiles" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.112317 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c119a1-197a-4413-a598-9eb61df0e072" containerName="collect-profiles" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.138996 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6x69"] Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.139239 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.305503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-catalog-content\") pod \"certified-operators-n6x69\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.306060 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-utilities\") pod \"certified-operators-n6x69\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.306138 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q6m2\" (UniqueName: \"kubernetes.io/projected/e51c7beb-f330-4b5b-acdd-b2703ad3548e-kube-api-access-9q6m2\") pod \"certified-operators-n6x69\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.408690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-catalog-content\") pod \"certified-operators-n6x69\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.408804 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-utilities\") pod 
\"certified-operators-n6x69\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.409270 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-utilities\") pod \"certified-operators-n6x69\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.409330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q6m2\" (UniqueName: \"kubernetes.io/projected/e51c7beb-f330-4b5b-acdd-b2703ad3548e-kube-api-access-9q6m2\") pod \"certified-operators-n6x69\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.409332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-catalog-content\") pod \"certified-operators-n6x69\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.435023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q6m2\" (UniqueName: \"kubernetes.io/projected/e51c7beb-f330-4b5b-acdd-b2703ad3548e-kube-api-access-9q6m2\") pod \"certified-operators-n6x69\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:03 crc kubenswrapper[4707]: I0129 04:17:03.468933 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:04 crc kubenswrapper[4707]: I0129 04:17:03.995886 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6x69"] Jan 29 04:17:04 crc kubenswrapper[4707]: I0129 04:17:04.243550 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:17:04 crc kubenswrapper[4707]: E0129 04:17:04.244139 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:17:04 crc kubenswrapper[4707]: I0129 04:17:04.557858 4707 generic.go:334] "Generic (PLEG): container finished" podID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerID="b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8" exitCode=0 Jan 29 04:17:04 crc kubenswrapper[4707]: I0129 04:17:04.557917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6x69" event={"ID":"e51c7beb-f330-4b5b-acdd-b2703ad3548e","Type":"ContainerDied","Data":"b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8"} Jan 29 04:17:04 crc kubenswrapper[4707]: I0129 04:17:04.557955 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6x69" event={"ID":"e51c7beb-f330-4b5b-acdd-b2703ad3548e","Type":"ContainerStarted","Data":"4cc17cf28665a5dd0651583f16128a5944fbe50061afdedce580bb482710b390"} Jan 29 04:17:04 crc kubenswrapper[4707]: I0129 04:17:04.560361 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 
04:17:05 crc kubenswrapper[4707]: I0129 04:17:05.568585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6x69" event={"ID":"e51c7beb-f330-4b5b-acdd-b2703ad3548e","Type":"ContainerStarted","Data":"01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d"} Jan 29 04:17:06 crc kubenswrapper[4707]: I0129 04:17:06.582451 4707 generic.go:334] "Generic (PLEG): container finished" podID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerID="01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d" exitCode=0 Jan 29 04:17:06 crc kubenswrapper[4707]: I0129 04:17:06.582582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6x69" event={"ID":"e51c7beb-f330-4b5b-acdd-b2703ad3548e","Type":"ContainerDied","Data":"01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d"} Jan 29 04:17:07 crc kubenswrapper[4707]: I0129 04:17:07.599634 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6x69" event={"ID":"e51c7beb-f330-4b5b-acdd-b2703ad3548e","Type":"ContainerStarted","Data":"1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84"} Jan 29 04:17:07 crc kubenswrapper[4707]: I0129 04:17:07.639754 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n6x69" podStartSLOduration=2.105303281 podStartE2EDuration="4.63973708s" podCreationTimestamp="2026-01-29 04:17:03 +0000 UTC" firstStartedPulling="2026-01-29 04:17:04.560072515 +0000 UTC m=+2978.044301430" lastFinishedPulling="2026-01-29 04:17:07.094506314 +0000 UTC m=+2980.578735229" observedRunningTime="2026-01-29 04:17:07.632530259 +0000 UTC m=+2981.116759174" watchObservedRunningTime="2026-01-29 04:17:07.63973708 +0000 UTC m=+2981.123965985" Jan 29 04:17:13 crc kubenswrapper[4707]: I0129 04:17:13.470069 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:13 crc kubenswrapper[4707]: I0129 04:17:13.470826 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:13 crc kubenswrapper[4707]: I0129 04:17:13.549699 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:13 crc kubenswrapper[4707]: I0129 04:17:13.720143 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:13 crc kubenswrapper[4707]: I0129 04:17:13.807237 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6x69"] Jan 29 04:17:15 crc kubenswrapper[4707]: I0129 04:17:15.244506 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:17:15 crc kubenswrapper[4707]: E0129 04:17:15.245241 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:17:15 crc kubenswrapper[4707]: I0129 04:17:15.693232 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n6x69" podUID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerName="registry-server" containerID="cri-o://1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84" gracePeriod=2 Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.230665 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.338888 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-catalog-content\") pod \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.339088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-utilities\") pod \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.339133 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q6m2\" (UniqueName: \"kubernetes.io/projected/e51c7beb-f330-4b5b-acdd-b2703ad3548e-kube-api-access-9q6m2\") pod \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\" (UID: \"e51c7beb-f330-4b5b-acdd-b2703ad3548e\") " Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.340262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-utilities" (OuterVolumeSpecName: "utilities") pod "e51c7beb-f330-4b5b-acdd-b2703ad3548e" (UID: "e51c7beb-f330-4b5b-acdd-b2703ad3548e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.348777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51c7beb-f330-4b5b-acdd-b2703ad3548e-kube-api-access-9q6m2" (OuterVolumeSpecName: "kube-api-access-9q6m2") pod "e51c7beb-f330-4b5b-acdd-b2703ad3548e" (UID: "e51c7beb-f330-4b5b-acdd-b2703ad3548e"). InnerVolumeSpecName "kube-api-access-9q6m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.409317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e51c7beb-f330-4b5b-acdd-b2703ad3548e" (UID: "e51c7beb-f330-4b5b-acdd-b2703ad3548e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.441184 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.441214 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e51c7beb-f330-4b5b-acdd-b2703ad3548e-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.441224 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q6m2\" (UniqueName: \"kubernetes.io/projected/e51c7beb-f330-4b5b-acdd-b2703ad3548e-kube-api-access-9q6m2\") on node \"crc\" DevicePath \"\"" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.704802 4707 generic.go:334] "Generic (PLEG): container finished" podID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerID="1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84" exitCode=0 Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.704886 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6x69" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.704911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6x69" event={"ID":"e51c7beb-f330-4b5b-acdd-b2703ad3548e","Type":"ContainerDied","Data":"1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84"} Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.705876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6x69" event={"ID":"e51c7beb-f330-4b5b-acdd-b2703ad3548e","Type":"ContainerDied","Data":"4cc17cf28665a5dd0651583f16128a5944fbe50061afdedce580bb482710b390"} Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.705952 4707 scope.go:117] "RemoveContainer" containerID="1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.730987 4707 scope.go:117] "RemoveContainer" containerID="01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.764478 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6x69"] Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.767860 4707 scope.go:117] "RemoveContainer" containerID="b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.772121 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n6x69"] Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.805442 4707 scope.go:117] "RemoveContainer" containerID="1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84" Jan 29 04:17:16 crc kubenswrapper[4707]: E0129 04:17:16.806005 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84\": container with ID starting with 1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84 not found: ID does not exist" containerID="1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.806061 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84"} err="failed to get container status \"1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84\": rpc error: code = NotFound desc = could not find container \"1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84\": container with ID starting with 1f024d2ab041ae4aad831db5215f127b3c5539a1cec7a728ae74489dea2cfc84 not found: ID does not exist" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.806099 4707 scope.go:117] "RemoveContainer" containerID="01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d" Jan 29 04:17:16 crc kubenswrapper[4707]: E0129 04:17:16.806525 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d\": container with ID starting with 01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d not found: ID does not exist" containerID="01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.806573 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d"} err="failed to get container status \"01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d\": rpc error: code = NotFound desc = could not find container \"01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d\": container with ID 
starting with 01991ea3711ace4838037ca3c777fa7223bd9db94764684137370ba8de79236d not found: ID does not exist" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.806597 4707 scope.go:117] "RemoveContainer" containerID="b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8" Jan 29 04:17:16 crc kubenswrapper[4707]: E0129 04:17:16.806980 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8\": container with ID starting with b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8 not found: ID does not exist" containerID="b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8" Jan 29 04:17:16 crc kubenswrapper[4707]: I0129 04:17:16.807045 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8"} err="failed to get container status \"b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8\": rpc error: code = NotFound desc = could not find container \"b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8\": container with ID starting with b28812151a06950b8615a221228392072a94cd503d97e9a6f475d6deaf7924c8 not found: ID does not exist" Jan 29 04:17:17 crc kubenswrapper[4707]: I0129 04:17:17.255977 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" path="/var/lib/kubelet/pods/e51c7beb-f330-4b5b-acdd-b2703ad3548e/volumes" Jan 29 04:17:26 crc kubenswrapper[4707]: I0129 04:17:26.245279 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:17:26 crc kubenswrapper[4707]: E0129 04:17:26.246720 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:17:33 crc kubenswrapper[4707]: I0129 04:17:33.186346 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7886d5cc69-w8rzq_0a32b73c-f66f-425f-81a9-ef1cc36041d4/manager/0.log" Jan 29 04:17:39 crc kubenswrapper[4707]: I0129 04:17:39.243326 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:17:39 crc kubenswrapper[4707]: E0129 04:17:39.244226 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.250092 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc"] Jan 29 04:17:46 crc kubenswrapper[4707]: E0129 04:17:46.251447 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerName="registry-server" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.251467 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerName="registry-server" Jan 29 04:17:46 crc kubenswrapper[4707]: E0129 04:17:46.251523 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" 
containerName="extract-content" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.251533 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerName="extract-content" Jan 29 04:17:46 crc kubenswrapper[4707]: E0129 04:17:46.251570 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerName="extract-utilities" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.251579 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerName="extract-utilities" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.251871 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51c7beb-f330-4b5b-acdd-b2703ad3548e" containerName="registry-server" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.254376 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.258414 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.269600 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc"] Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.332254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.332331 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g77ks\" (UniqueName: \"kubernetes.io/projected/92ea08d2-9b03-4237-8606-3ce08e97a0a3-kube-api-access-g77ks\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.332383 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.433859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.433916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g77ks\" (UniqueName: \"kubernetes.io/projected/92ea08d2-9b03-4237-8606-3ce08e97a0a3-kube-api-access-g77ks\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.433963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.434421 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.434452 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.452880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g77ks\" (UniqueName: \"kubernetes.io/projected/92ea08d2-9b03-4237-8606-3ce08e97a0a3-kube-api-access-g77ks\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:46 crc kubenswrapper[4707]: I0129 04:17:46.581960 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:47 crc kubenswrapper[4707]: I0129 04:17:47.079683 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc"] Jan 29 04:17:48 crc kubenswrapper[4707]: I0129 04:17:48.032646 4707 generic.go:334] "Generic (PLEG): container finished" podID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerID="19b1e4dfc4c0814f347bb7cabcda567f2c1ce74868376f9683324750743835a5" exitCode=0 Jan 29 04:17:48 crc kubenswrapper[4707]: I0129 04:17:48.032706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" event={"ID":"92ea08d2-9b03-4237-8606-3ce08e97a0a3","Type":"ContainerDied","Data":"19b1e4dfc4c0814f347bb7cabcda567f2c1ce74868376f9683324750743835a5"} Jan 29 04:17:48 crc kubenswrapper[4707]: I0129 04:17:48.033755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" event={"ID":"92ea08d2-9b03-4237-8606-3ce08e97a0a3","Type":"ContainerStarted","Data":"eba0bc37dfe60f8c64708feecb1a665b2c6b707eb258af6b2d8b05e2acd907b9"} Jan 29 04:17:50 crc kubenswrapper[4707]: I0129 04:17:50.063375 4707 generic.go:334] "Generic (PLEG): container finished" podID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerID="9a0f2b6def493dd0fa7a5216d345884b84eb45d33096c8e19588154373836079" exitCode=0 Jan 29 04:17:50 crc kubenswrapper[4707]: I0129 04:17:50.063498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" event={"ID":"92ea08d2-9b03-4237-8606-3ce08e97a0a3","Type":"ContainerDied","Data":"9a0f2b6def493dd0fa7a5216d345884b84eb45d33096c8e19588154373836079"} Jan 29 04:17:51 crc kubenswrapper[4707]: I0129 04:17:51.079438 4707 
generic.go:334] "Generic (PLEG): container finished" podID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerID="43b642b3d0ebdfc19959ad798390775d6a531d383b205497bbf0324b2b7239f4" exitCode=0 Jan 29 04:17:51 crc kubenswrapper[4707]: I0129 04:17:51.079533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" event={"ID":"92ea08d2-9b03-4237-8606-3ce08e97a0a3","Type":"ContainerDied","Data":"43b642b3d0ebdfc19959ad798390775d6a531d383b205497bbf0324b2b7239f4"} Jan 29 04:17:51 crc kubenswrapper[4707]: I0129 04:17:51.243845 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:17:51 crc kubenswrapper[4707]: E0129 04:17:51.244114 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.436171 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.571475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-util\") pod \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.571745 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g77ks\" (UniqueName: \"kubernetes.io/projected/92ea08d2-9b03-4237-8606-3ce08e97a0a3-kube-api-access-g77ks\") pod \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.571802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-bundle\") pod \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\" (UID: \"92ea08d2-9b03-4237-8606-3ce08e97a0a3\") " Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.574315 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-bundle" (OuterVolumeSpecName: "bundle") pod "92ea08d2-9b03-4237-8606-3ce08e97a0a3" (UID: "92ea08d2-9b03-4237-8606-3ce08e97a0a3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.578013 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ea08d2-9b03-4237-8606-3ce08e97a0a3-kube-api-access-g77ks" (OuterVolumeSpecName: "kube-api-access-g77ks") pod "92ea08d2-9b03-4237-8606-3ce08e97a0a3" (UID: "92ea08d2-9b03-4237-8606-3ce08e97a0a3"). InnerVolumeSpecName "kube-api-access-g77ks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.585900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-util" (OuterVolumeSpecName: "util") pod "92ea08d2-9b03-4237-8606-3ce08e97a0a3" (UID: "92ea08d2-9b03-4237-8606-3ce08e97a0a3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.674779 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-util\") on node \"crc\" DevicePath \"\"" Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.674819 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g77ks\" (UniqueName: \"kubernetes.io/projected/92ea08d2-9b03-4237-8606-3ce08e97a0a3-kube-api-access-g77ks\") on node \"crc\" DevicePath \"\"" Jan 29 04:17:52 crc kubenswrapper[4707]: I0129 04:17:52.674832 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92ea08d2-9b03-4237-8606-3ce08e97a0a3-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:17:53 crc kubenswrapper[4707]: I0129 04:17:53.099674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" event={"ID":"92ea08d2-9b03-4237-8606-3ce08e97a0a3","Type":"ContainerDied","Data":"eba0bc37dfe60f8c64708feecb1a665b2c6b707eb258af6b2d8b05e2acd907b9"} Jan 29 04:17:53 crc kubenswrapper[4707]: I0129 04:17:53.099724 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eba0bc37dfe60f8c64708feecb1a665b2c6b707eb258af6b2d8b05e2acd907b9" Jan 29 04:17:53 crc kubenswrapper[4707]: I0129 04:17:53.099993 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.244500 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:18:03 crc kubenswrapper[4707]: E0129 04:18:03.245773 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.953224 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz"] Jan 29 04:18:03 crc kubenswrapper[4707]: E0129 04:18:03.954283 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerName="pull" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.954303 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerName="pull" Jan 29 04:18:03 crc kubenswrapper[4707]: E0129 04:18:03.954337 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerName="extract" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.954345 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerName="extract" Jan 29 04:18:03 crc kubenswrapper[4707]: E0129 04:18:03.954389 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerName="util" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.954397 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerName="util" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.954625 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ea08d2-9b03-4237-8606-3ce08e97a0a3" containerName="extract" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.955486 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.961883 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.962890 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-cj2cx" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.963292 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 29 04:18:03 crc kubenswrapper[4707]: I0129 04:18:03.973330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz"] Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.047145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn5lp\" (UniqueName: \"kubernetes.io/projected/aac893f7-df17-486c-895f-b5305b76bc60-kube-api-access-xn5lp\") pod \"obo-prometheus-operator-68bc856cb9-zggfz\" (UID: \"aac893f7-df17-486c-895f-b5305b76bc60\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.077636 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl"] Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.079444 4707 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.082309 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-mt78c" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.084341 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.104360 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg"] Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.105791 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.140603 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl"] Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.168126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23eb4701-1c82-40e4-990b-87c4044f51cc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-dhgdl\" (UID: \"23eb4701-1c82-40e4-990b-87c4044f51cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.168682 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn5lp\" (UniqueName: \"kubernetes.io/projected/aac893f7-df17-486c-895f-b5305b76bc60-kube-api-access-xn5lp\") pod \"obo-prometheus-operator-68bc856cb9-zggfz\" (UID: 
\"aac893f7-df17-486c-895f-b5305b76bc60\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.168883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23eb4701-1c82-40e4-990b-87c4044f51cc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-dhgdl\" (UID: \"23eb4701-1c82-40e4-990b-87c4044f51cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.205052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn5lp\" (UniqueName: \"kubernetes.io/projected/aac893f7-df17-486c-895f-b5305b76bc60-kube-api-access-xn5lp\") pod \"obo-prometheus-operator-68bc856cb9-zggfz\" (UID: \"aac893f7-df17-486c-895f-b5305b76bc60\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.233690 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg"] Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.273368 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd346f51-b69b-4ac8-b4d3-d24201dd0015-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-qmjjg\" (UID: \"fd346f51-b69b-4ac8-b4d3-d24201dd0015\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.273474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23eb4701-1c82-40e4-990b-87c4044f51cc-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-dd586f795-dhgdl\" (UID: \"23eb4701-1c82-40e4-990b-87c4044f51cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.273650 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd346f51-b69b-4ac8-b4d3-d24201dd0015-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-qmjjg\" (UID: \"fd346f51-b69b-4ac8-b4d3-d24201dd0015\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.273704 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23eb4701-1c82-40e4-990b-87c4044f51cc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-dhgdl\" (UID: \"23eb4701-1c82-40e4-990b-87c4044f51cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.277831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23eb4701-1c82-40e4-990b-87c4044f51cc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-dhgdl\" (UID: \"23eb4701-1c82-40e4-990b-87c4044f51cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.289749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23eb4701-1c82-40e4-990b-87c4044f51cc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-dhgdl\" (UID: \"23eb4701-1c82-40e4-990b-87c4044f51cc\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.298454 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-gkwm8"] Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.300194 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.303149 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-rhtmh" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.303361 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.305063 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.349654 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-gkwm8"] Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.376511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd346f51-b69b-4ac8-b4d3-d24201dd0015-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-qmjjg\" (UID: \"fd346f51-b69b-4ac8-b4d3-d24201dd0015\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.376667 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb2589dc-26af-42a0-8fb9-8f908a0fbac9-observability-operator-tls\") pod 
\"observability-operator-59bdc8b94-gkwm8\" (UID: \"cb2589dc-26af-42a0-8fb9-8f908a0fbac9\") " pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.376870 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dq22\" (UniqueName: \"kubernetes.io/projected/cb2589dc-26af-42a0-8fb9-8f908a0fbac9-kube-api-access-4dq22\") pod \"observability-operator-59bdc8b94-gkwm8\" (UID: \"cb2589dc-26af-42a0-8fb9-8f908a0fbac9\") " pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.377024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd346f51-b69b-4ac8-b4d3-d24201dd0015-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-qmjjg\" (UID: \"fd346f51-b69b-4ac8-b4d3-d24201dd0015\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.384490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd346f51-b69b-4ac8-b4d3-d24201dd0015-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-qmjjg\" (UID: \"fd346f51-b69b-4ac8-b4d3-d24201dd0015\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.390145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd346f51-b69b-4ac8-b4d3-d24201dd0015-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd586f795-qmjjg\" (UID: \"fd346f51-b69b-4ac8-b4d3-d24201dd0015\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" Jan 29 04:18:04 crc kubenswrapper[4707]: 
I0129 04:18:04.398290 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.427343 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.486921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb2589dc-26af-42a0-8fb9-8f908a0fbac9-observability-operator-tls\") pod \"observability-operator-59bdc8b94-gkwm8\" (UID: \"cb2589dc-26af-42a0-8fb9-8f908a0fbac9\") " pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.487163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dq22\" (UniqueName: \"kubernetes.io/projected/cb2589dc-26af-42a0-8fb9-8f908a0fbac9-kube-api-access-4dq22\") pod \"observability-operator-59bdc8b94-gkwm8\" (UID: \"cb2589dc-26af-42a0-8fb9-8f908a0fbac9\") " pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.499404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb2589dc-26af-42a0-8fb9-8f908a0fbac9-observability-operator-tls\") pod \"observability-operator-59bdc8b94-gkwm8\" (UID: \"cb2589dc-26af-42a0-8fb9-8f908a0fbac9\") " pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.515119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dq22\" (UniqueName: \"kubernetes.io/projected/cb2589dc-26af-42a0-8fb9-8f908a0fbac9-kube-api-access-4dq22\") pod 
\"observability-operator-59bdc8b94-gkwm8\" (UID: \"cb2589dc-26af-42a0-8fb9-8f908a0fbac9\") " pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.554001 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-b6clh"] Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.556073 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.559569 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-clhz2" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.588333 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-b6clh"] Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.695635 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ab8ee44-5e15-42f6-861c-071cb82c90d4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-b6clh\" (UID: \"1ab8ee44-5e15-42f6-861c-071cb82c90d4\") " pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.695773 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xckgn\" (UniqueName: \"kubernetes.io/projected/1ab8ee44-5e15-42f6-861c-071cb82c90d4-kube-api-access-xckgn\") pod \"perses-operator-5bf474d74f-b6clh\" (UID: \"1ab8ee44-5e15-42f6-861c-071cb82c90d4\") " pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.796718 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.798026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ab8ee44-5e15-42f6-861c-071cb82c90d4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-b6clh\" (UID: \"1ab8ee44-5e15-42f6-861c-071cb82c90d4\") " pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.798182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xckgn\" (UniqueName: \"kubernetes.io/projected/1ab8ee44-5e15-42f6-861c-071cb82c90d4-kube-api-access-xckgn\") pod \"perses-operator-5bf474d74f-b6clh\" (UID: \"1ab8ee44-5e15-42f6-861c-071cb82c90d4\") " pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.799439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ab8ee44-5e15-42f6-861c-071cb82c90d4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-b6clh\" (UID: \"1ab8ee44-5e15-42f6-861c-071cb82c90d4\") " pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.820871 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xckgn\" (UniqueName: \"kubernetes.io/projected/1ab8ee44-5e15-42f6-861c-071cb82c90d4-kube-api-access-xckgn\") pod \"perses-operator-5bf474d74f-b6clh\" (UID: \"1ab8ee44-5e15-42f6-861c-071cb82c90d4\") " pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.889624 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:04 crc kubenswrapper[4707]: I0129 04:18:04.920619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz"] Jan 29 04:18:05 crc kubenswrapper[4707]: I0129 04:18:05.156520 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl"] Jan 29 04:18:05 crc kubenswrapper[4707]: W0129 04:18:05.164916 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23eb4701_1c82_40e4_990b_87c4044f51cc.slice/crio-77c3471c2845754f1c102c91a70bfcde90263b700d7372e615461e3386b97423 WatchSource:0}: Error finding container 77c3471c2845754f1c102c91a70bfcde90263b700d7372e615461e3386b97423: Status 404 returned error can't find the container with id 77c3471c2845754f1c102c91a70bfcde90263b700d7372e615461e3386b97423 Jan 29 04:18:05 crc kubenswrapper[4707]: I0129 04:18:05.170028 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg"] Jan 29 04:18:05 crc kubenswrapper[4707]: I0129 04:18:05.272605 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz" event={"ID":"aac893f7-df17-486c-895f-b5305b76bc60","Type":"ContainerStarted","Data":"7f64f859166dfbf0b8e6594dd9d70fcaa69b8f3fc0b4a297c32bc3984dbd0c5a"} Jan 29 04:18:05 crc kubenswrapper[4707]: I0129 04:18:05.272672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" event={"ID":"23eb4701-1c82-40e4-990b-87c4044f51cc","Type":"ContainerStarted","Data":"77c3471c2845754f1c102c91a70bfcde90263b700d7372e615461e3386b97423"} Jan 29 04:18:05 crc kubenswrapper[4707]: I0129 04:18:05.272971 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" event={"ID":"fd346f51-b69b-4ac8-b4d3-d24201dd0015","Type":"ContainerStarted","Data":"cdbecf79d4b62db3834d5cc04802d53256cbd59490f35338f12cff5bac8ee059"} Jan 29 04:18:05 crc kubenswrapper[4707]: I0129 04:18:05.460727 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-gkwm8"] Jan 29 04:18:05 crc kubenswrapper[4707]: I0129 04:18:05.644267 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-b6clh"] Jan 29 04:18:06 crc kubenswrapper[4707]: I0129 04:18:06.291652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" event={"ID":"cb2589dc-26af-42a0-8fb9-8f908a0fbac9","Type":"ContainerStarted","Data":"1082f4ea6c92830dec5acad0bc0156f46378aa3d28915d0d749da0b9389b9317"} Jan 29 04:18:06 crc kubenswrapper[4707]: I0129 04:18:06.293754 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-b6clh" event={"ID":"1ab8ee44-5e15-42f6-861c-071cb82c90d4","Type":"ContainerStarted","Data":"009c8b455bab7e99b0faf0946189437e4537b1cb5ce28a66ef5d6c13a384d949"} Jan 29 04:18:14 crc kubenswrapper[4707]: I0129 04:18:14.244934 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:18:14 crc kubenswrapper[4707]: E0129 04:18:14.246103 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:18:19 crc 
kubenswrapper[4707]: I0129 04:18:19.520733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" event={"ID":"cb2589dc-26af-42a0-8fb9-8f908a0fbac9","Type":"ContainerStarted","Data":"b9321e5abd69bc0001821677988bbf120d82b5d35a4613f698b948a439c09827"} Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.521757 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.522943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz" event={"ID":"aac893f7-df17-486c-895f-b5305b76bc60","Type":"ContainerStarted","Data":"75bd699df37636b0bc895f5cdfaedc0e9e5a94b8e7b832772d30ea287452858e"} Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.524570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-b6clh" event={"ID":"1ab8ee44-5e15-42f6-861c-071cb82c90d4","Type":"ContainerStarted","Data":"83c904961cf397836b5a93ecc8a7d9272e6b7fc13c455ec6c4305e26c4298c94"} Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.524716 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.526365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" event={"ID":"23eb4701-1c82-40e4-990b-87c4044f51cc","Type":"ContainerStarted","Data":"068ade98b0764999fd768f86f7a8c760c9b8e678e43caacedc507b4b17fd16cc"} Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.528334 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" 
event={"ID":"fd346f51-b69b-4ac8-b4d3-d24201dd0015","Type":"ContainerStarted","Data":"0db15dfbc967a99f1215cf13d9db9221b2eb66d1f4f40fd04b61304a983e633f"} Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.561890 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" podStartSLOduration=2.566026923 podStartE2EDuration="15.561859968s" podCreationTimestamp="2026-01-29 04:18:04 +0000 UTC" firstStartedPulling="2026-01-29 04:18:05.491750303 +0000 UTC m=+3038.975979208" lastFinishedPulling="2026-01-29 04:18:18.487583348 +0000 UTC m=+3051.971812253" observedRunningTime="2026-01-29 04:18:19.551925231 +0000 UTC m=+3053.036154146" watchObservedRunningTime="2026-01-29 04:18:19.561859968 +0000 UTC m=+3053.046088873" Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.568496 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-gkwm8" Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.589975 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-dhgdl" podStartSLOduration=2.3347251079999998 podStartE2EDuration="15.589957103s" podCreationTimestamp="2026-01-29 04:18:04 +0000 UTC" firstStartedPulling="2026-01-29 04:18:05.17167092 +0000 UTC m=+3038.655899825" lastFinishedPulling="2026-01-29 04:18:18.426902915 +0000 UTC m=+3051.911131820" observedRunningTime="2026-01-29 04:18:19.587196976 +0000 UTC m=+3053.071425881" watchObservedRunningTime="2026-01-29 04:18:19.589957103 +0000 UTC m=+3053.074186008" Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.709035 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-b6clh" podStartSLOduration=2.882987507 podStartE2EDuration="15.709006585s" podCreationTimestamp="2026-01-29 04:18:04 +0000 UTC" 
firstStartedPulling="2026-01-29 04:18:05.662863407 +0000 UTC m=+3039.147092312" lastFinishedPulling="2026-01-29 04:18:18.488882485 +0000 UTC m=+3051.973111390" observedRunningTime="2026-01-29 04:18:19.693292786 +0000 UTC m=+3053.177521691" watchObservedRunningTime="2026-01-29 04:18:19.709006585 +0000 UTC m=+3053.193235490" Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.723705 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-zggfz" podStartSLOduration=3.18608175 podStartE2EDuration="16.723683745s" podCreationTimestamp="2026-01-29 04:18:03 +0000 UTC" firstStartedPulling="2026-01-29 04:18:04.952217466 +0000 UTC m=+3038.436446371" lastFinishedPulling="2026-01-29 04:18:18.489819461 +0000 UTC m=+3051.974048366" observedRunningTime="2026-01-29 04:18:19.651277584 +0000 UTC m=+3053.135506489" watchObservedRunningTime="2026-01-29 04:18:19.723683745 +0000 UTC m=+3053.207912650" Jan 29 04:18:19 crc kubenswrapper[4707]: I0129 04:18:19.797387 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd586f795-qmjjg" podStartSLOduration=2.553880134 podStartE2EDuration="15.797365141s" podCreationTimestamp="2026-01-29 04:18:04 +0000 UTC" firstStartedPulling="2026-01-29 04:18:05.163719108 +0000 UTC m=+3038.647948013" lastFinishedPulling="2026-01-29 04:18:18.407204115 +0000 UTC m=+3051.891433020" observedRunningTime="2026-01-29 04:18:19.744750082 +0000 UTC m=+3053.228978987" watchObservedRunningTime="2026-01-29 04:18:19.797365141 +0000 UTC m=+3053.281594036" Jan 29 04:18:24 crc kubenswrapper[4707]: I0129 04:18:24.892613 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-b6clh" Jan 29 04:18:29 crc kubenswrapper[4707]: I0129 04:18:29.243368 4707 scope.go:117] "RemoveContainer" 
containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:18:29 crc kubenswrapper[4707]: E0129 04:18:29.244237 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.054604 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.055234 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-api" containerID="cri-o://1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9" gracePeriod=30 Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.055341 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-listener" containerID="cri-o://7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7" gracePeriod=30 Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.055375 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-evaluator" containerID="cri-o://7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651" gracePeriod=30 Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.055360 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-notifier" 
containerID="cri-o://1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087" gracePeriod=30 Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.587570 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.590238 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.602993 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.603080 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.603209 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.603272 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.603273 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-lrjjn" Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.639275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.669324 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh6vx\" (UniqueName: \"kubernetes.io/projected/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-kube-api-access-gh6vx\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0" Jan 29 04:18:32 crc 
kubenswrapper[4707]: I0129 04:18:32.669593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.669652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.669741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.669763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.669811 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.669918 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.706243 4707 generic.go:334] "Generic (PLEG): container finished" podID="bccbabd7-0455-417c-865c-757fa8f011f0" containerID="7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651" exitCode=0
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.706278 4707 generic.go:334] "Generic (PLEG): container finished" podID="bccbabd7-0455-417c-865c-757fa8f011f0" containerID="1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9" exitCode=0
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.706301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerDied","Data":"7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651"}
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.706330 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerDied","Data":"1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9"}
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.772487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.772521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.772582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.772633 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.772687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh6vx\" (UniqueName: \"kubernetes.io/projected/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-kube-api-access-gh6vx\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.772778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.772798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.773750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.784075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.784531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.784640 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.787762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.789760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.796030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh6vx\" (UniqueName: \"kubernetes.io/projected/21d0ba1c-ab07-48da-8e34-93da9d1c9c6a-kube-api-access-gh6vx\") pod \"alertmanager-metric-storage-0\" (UID: \"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a\") " pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:32 crc kubenswrapper[4707]: I0129 04:18:32.958491 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.298914 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.301422 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.301264 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.304956 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-k8x5b"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.305038 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.305162 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.305223 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.305358 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.305470 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.306241 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.308243 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.382253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bm2k\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-kube-api-access-7bm2k\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.382364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.382855 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.382925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.382982 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.383009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.383039 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.383068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.383108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.383134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.484390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bm2k\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-kube-api-access-7bm2k\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.484819 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.484849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.484899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.484943 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.484967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.484999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.485022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.485051 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.485074 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.485719 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.485741 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.486062 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.492801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.492954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.497854 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.498580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.499012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.501163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bm2k\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-kube-api-access-7bm2k\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.508455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.528592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"prometheus-metric-storage-0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.633953 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.730554 4707 generic.go:334] "Generic (PLEG): container finished" podID="bccbabd7-0455-417c-865c-757fa8f011f0" containerID="7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7" exitCode=0
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.730582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerDied","Data":"7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7"}
Jan 29 04:18:33 crc kubenswrapper[4707]: I0129 04:18:33.786376 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Jan 29 04:18:33 crc kubenswrapper[4707]: W0129 04:18:33.801704 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21d0ba1c_ab07_48da_8e34_93da9d1c9c6a.slice/crio-8db45a0fbee33549e6d649e2e816eb9cdeb68e59daa78b4214f3f8302c91a553 WatchSource:0}: Error finding container 8db45a0fbee33549e6d649e2e816eb9cdeb68e59daa78b4214f3f8302c91a553: Status 404 returned error can't find the container with id 8db45a0fbee33549e6d649e2e816eb9cdeb68e59daa78b4214f3f8302c91a553
Jan 29 04:18:34 crc kubenswrapper[4707]: I0129 04:18:34.020859 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 29 04:18:34 crc kubenswrapper[4707]: I0129 04:18:34.740094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a","Type":"ContainerStarted","Data":"8db45a0fbee33549e6d649e2e816eb9cdeb68e59daa78b4214f3f8302c91a553"}
Jan 29 04:18:34 crc kubenswrapper[4707]: I0129 04:18:34.741647 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerStarted","Data":"5fce1216f37a865c034ce7bd8ee2af4d0aa8779009cf75d70f5fadd0e90c7629"}
Jan 29 04:18:40 crc kubenswrapper[4707]: I0129 04:18:40.805909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerStarted","Data":"29f41c079460f385d2185e2cb0bdd2d6fd36798fa80877f00a0ba9b35b625cf6"}
Jan 29 04:18:40 crc kubenswrapper[4707]: I0129 04:18:40.809461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a","Type":"ContainerStarted","Data":"5b553e8af12040022ed63c20f7cabdef1d47a8f245b23a5583654fdede40d223"}
Jan 29 04:18:41 crc kubenswrapper[4707]: I0129 04:18:41.243648 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7"
Jan 29 04:18:41 crc kubenswrapper[4707]: E0129 04:18:41.243955 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.561196 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.688291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-internal-tls-certs\") pod \"bccbabd7-0455-417c-865c-757fa8f011f0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") "
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.688358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-scripts\") pod \"bccbabd7-0455-417c-865c-757fa8f011f0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") "
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.688516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-public-tls-certs\") pod \"bccbabd7-0455-417c-865c-757fa8f011f0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") "
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.688694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-combined-ca-bundle\") pod \"bccbabd7-0455-417c-865c-757fa8f011f0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") "
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.688724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9594\" (UniqueName: \"kubernetes.io/projected/bccbabd7-0455-417c-865c-757fa8f011f0-kube-api-access-j9594\") pod \"bccbabd7-0455-417c-865c-757fa8f011f0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") "
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.688785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-config-data\") pod \"bccbabd7-0455-417c-865c-757fa8f011f0\" (UID: \"bccbabd7-0455-417c-865c-757fa8f011f0\") "
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.708310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-scripts" (OuterVolumeSpecName: "scripts") pod "bccbabd7-0455-417c-865c-757fa8f011f0" (UID: "bccbabd7-0455-417c-865c-757fa8f011f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.708381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccbabd7-0455-417c-865c-757fa8f011f0-kube-api-access-j9594" (OuterVolumeSpecName: "kube-api-access-j9594") pod "bccbabd7-0455-417c-865c-757fa8f011f0" (UID: "bccbabd7-0455-417c-865c-757fa8f011f0"). InnerVolumeSpecName "kube-api-access-j9594". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.782250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bccbabd7-0455-417c-865c-757fa8f011f0" (UID: "bccbabd7-0455-417c-865c-757fa8f011f0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.790902 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.790943 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9594\" (UniqueName: \"kubernetes.io/projected/bccbabd7-0455-417c-865c-757fa8f011f0-kube-api-access-j9594\") on node \"crc\" DevicePath \"\""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.790958 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.812658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bccbabd7-0455-417c-865c-757fa8f011f0" (UID: "bccbabd7-0455-417c-865c-757fa8f011f0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.837092 4707 generic.go:334] "Generic (PLEG): container finished" podID="bccbabd7-0455-417c-865c-757fa8f011f0" containerID="1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087" exitCode=0
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.837151 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerDied","Data":"1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087"}
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.837184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bccbabd7-0455-417c-865c-757fa8f011f0","Type":"ContainerDied","Data":"bd7ae5178739a4ed402d7ccf7e31a89ee3d282b619f0e53b87f1ea1b4d7652e5"}
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.837205 4707 scope.go:117] "RemoveContainer" containerID="7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.837389 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.865321 4707 scope.go:117] "RemoveContainer" containerID="1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.880249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-config-data" (OuterVolumeSpecName: "config-data") pod "bccbabd7-0455-417c-865c-757fa8f011f0" (UID: "bccbabd7-0455-417c-865c-757fa8f011f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.903786 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.903826 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.903984 4707 scope.go:117] "RemoveContainer" containerID="7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.910331 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bccbabd7-0455-417c-865c-757fa8f011f0" (UID: "bccbabd7-0455-417c-865c-757fa8f011f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.934519 4707 scope.go:117] "RemoveContainer" containerID="1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.958171 4707 scope.go:117] "RemoveContainer" containerID="7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7"
Jan 29 04:18:42 crc kubenswrapper[4707]: E0129 04:18:42.958774 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7\": container with ID starting with 7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7 not found: ID does not exist" containerID="7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.958827 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7"} err="failed to get container status \"7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7\": rpc error: code = NotFound desc = could not find container \"7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7\": container with ID starting with 7a74e76838ecb1d10a3722b39085ff1015affa70eaec52d98b4401fd51ba35b7 not found: ID does not exist"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.958850 4707 scope.go:117] "RemoveContainer" containerID="1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087"
Jan 29 04:18:42 crc kubenswrapper[4707]: E0129 04:18:42.960088 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087\": container with ID starting with 1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087 not found: ID does not exist" containerID="1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.960127 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087"} err="failed to get container status \"1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087\": rpc error: code = NotFound desc = could not find container \"1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087\": container with ID starting with 1de236891751b35245aa8aeb6a2b37afa010386cdc5838fca73eaefbf91ba087 not found: ID does not exist"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.960158 4707 scope.go:117] "RemoveContainer" containerID="7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651"
Jan 29 04:18:42 crc kubenswrapper[4707]: E0129 04:18:42.960701 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651\": container with ID starting with 7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651 not found: ID does not exist" containerID="7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.960794 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651"} err="failed to get container status \"7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651\": rpc error: code = NotFound desc = could not find container \"7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651\": container with ID starting with 7cb8688d0548ee6bdd41ecd3b12e645672447f491a83a4b72309825d4e121651 not found: ID does not exist"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.960852 4707 scope.go:117] "RemoveContainer" containerID="1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9"
Jan 29 04:18:42 crc kubenswrapper[4707]: E0129 04:18:42.961351 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9\": container with ID starting with 1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9 not found: ID does not exist" containerID="1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9"
Jan 29 04:18:42 crc kubenswrapper[4707]: I0129 04:18:42.961384 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9"} err="failed to get container status \"1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9\": rpc error: code = NotFound desc = could not find container \"1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9\": container with ID starting with 1f5cdd69155cd12a197d0fe5cdca1c73a8f1342648408d164a2fba65c2e1eee9 not found: ID does not exist"
Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.005588 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbabd7-0455-417c-865c-757fa8f011f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.187644 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.201200 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.212938 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Jan 29 04:18:43 crc kubenswrapper[4707]: E0129 
04:18:43.213589 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-listener" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.213611 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-listener" Jan 29 04:18:43 crc kubenswrapper[4707]: E0129 04:18:43.213632 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-evaluator" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.213644 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-evaluator" Jan 29 04:18:43 crc kubenswrapper[4707]: E0129 04:18:43.213664 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-notifier" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.213671 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-notifier" Jan 29 04:18:43 crc kubenswrapper[4707]: E0129 04:18:43.213726 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-api" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.213733 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-api" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.213957 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-evaluator" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.213977 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-api" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.213989 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-listener" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.214001 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" containerName="aodh-notifier" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.216461 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.220207 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.220215 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.220625 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.220635 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.221862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.222641 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-8fk7t" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.269496 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bccbabd7-0455-417c-865c-757fa8f011f0" path="/var/lib/kubelet/pods/bccbabd7-0455-417c-865c-757fa8f011f0/volumes" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.419188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-public-tls-certs\") 
pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.419570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-scripts\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.419739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-config-data\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.419942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.420043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-internal-tls-certs\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.420344 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftrfs\" (UniqueName: \"kubernetes.io/projected/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-kube-api-access-ftrfs\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.522791 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-scripts\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.522909 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-config-data\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.522998 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.523050 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-internal-tls-certs\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.523181 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftrfs\" (UniqueName: \"kubernetes.io/projected/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-kube-api-access-ftrfs\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.523287 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-public-tls-certs\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc 
kubenswrapper[4707]: I0129 04:18:43.532300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-public-tls-certs\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.532322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-internal-tls-certs\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.532611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.533373 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-config-data\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.550298 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-scripts\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.552879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftrfs\" (UniqueName: \"kubernetes.io/projected/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-kube-api-access-ftrfs\") pod \"aodh-0\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " 
pod="openstack/aodh-0" Jan 29 04:18:43 crc kubenswrapper[4707]: I0129 04:18:43.593628 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 29 04:18:44 crc kubenswrapper[4707]: I0129 04:18:44.146157 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:18:44 crc kubenswrapper[4707]: I0129 04:18:44.859734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerStarted","Data":"50a810bc0be28c2c85a514e862f9844f29ab5ec15b04a558d7c09e33dacfacd4"} Jan 29 04:18:45 crc kubenswrapper[4707]: I0129 04:18:45.876019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerStarted","Data":"fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f"} Jan 29 04:18:45 crc kubenswrapper[4707]: I0129 04:18:45.876820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerStarted","Data":"2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469"} Jan 29 04:18:46 crc kubenswrapper[4707]: I0129 04:18:46.900617 4707 generic.go:334] "Generic (PLEG): container finished" podID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerID="29f41c079460f385d2185e2cb0bdd2d6fd36798fa80877f00a0ba9b35b625cf6" exitCode=0 Jan 29 04:18:46 crc kubenswrapper[4707]: I0129 04:18:46.900731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerDied","Data":"29f41c079460f385d2185e2cb0bdd2d6fd36798fa80877f00a0ba9b35b625cf6"} Jan 29 04:18:46 crc kubenswrapper[4707]: I0129 04:18:46.918063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerStarted","Data":"dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865"} Jan 29 04:18:47 crc kubenswrapper[4707]: I0129 04:18:47.932848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerStarted","Data":"83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3"} Jan 29 04:18:47 crc kubenswrapper[4707]: I0129 04:18:47.973980 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.887359139 podStartE2EDuration="4.973962547s" podCreationTimestamp="2026-01-29 04:18:43 +0000 UTC" firstStartedPulling="2026-01-29 04:18:44.151774201 +0000 UTC m=+3077.636003096" lastFinishedPulling="2026-01-29 04:18:47.238377589 +0000 UTC m=+3080.722606504" observedRunningTime="2026-01-29 04:18:47.971020955 +0000 UTC m=+3081.455249860" watchObservedRunningTime="2026-01-29 04:18:47.973962547 +0000 UTC m=+3081.458191452" Jan 29 04:18:48 crc kubenswrapper[4707]: I0129 04:18:48.950657 4707 generic.go:334] "Generic (PLEG): container finished" podID="21d0ba1c-ab07-48da-8e34-93da9d1c9c6a" containerID="5b553e8af12040022ed63c20f7cabdef1d47a8f245b23a5583654fdede40d223" exitCode=0 Jan 29 04:18:48 crc kubenswrapper[4707]: I0129 04:18:48.950748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a","Type":"ContainerDied","Data":"5b553e8af12040022ed63c20f7cabdef1d47a8f245b23a5583654fdede40d223"} Jan 29 04:18:55 crc kubenswrapper[4707]: I0129 04:18:55.029056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerStarted","Data":"f8f19396d6f2d620ad43f834b927a1eb9fcf835523329a5fd6cce130abc3324c"} Jan 29 04:18:55 crc kubenswrapper[4707]: I0129 04:18:55.032158 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a","Type":"ContainerStarted","Data":"2da64699a4b2d9df81de9b346c7f4432ad166614fd0c90a5a0747920295abfe4"} Jan 29 04:18:55 crc kubenswrapper[4707]: I0129 04:18:55.244390 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:18:55 crc kubenswrapper[4707]: E0129 04:18:55.245068 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:18:59 crc kubenswrapper[4707]: I0129 04:18:59.070279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerStarted","Data":"49d2cc4780ffe0932c4e0f0e9379207a54689a710b13dbc0d38c7c7d9e94bd19"} Jan 29 04:19:00 crc kubenswrapper[4707]: I0129 04:19:00.083687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"21d0ba1c-ab07-48da-8e34-93da9d1c9c6a","Type":"ContainerStarted","Data":"b7339963305de79fd40180f6339f2dbb632124f8130618b709b0c4f829ba99e9"} Jan 29 04:19:00 crc kubenswrapper[4707]: I0129 04:19:00.084061 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Jan 29 04:19:00 crc kubenswrapper[4707]: I0129 04:19:00.087098 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Jan 29 04:19:00 crc kubenswrapper[4707]: I0129 04:19:00.114839 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.45936811 podStartE2EDuration="28.114807703s" podCreationTimestamp="2026-01-29 04:18:32 +0000 UTC" firstStartedPulling="2026-01-29 04:18:33.808039448 +0000 UTC m=+3067.292268353" lastFinishedPulling="2026-01-29 04:18:54.463479041 +0000 UTC m=+3087.947707946" observedRunningTime="2026-01-29 04:19:00.107726915 +0000 UTC m=+3093.591955830" watchObservedRunningTime="2026-01-29 04:19:00.114807703 +0000 UTC m=+3093.599036608" Jan 29 04:19:03 crc kubenswrapper[4707]: I0129 04:19:03.111948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerStarted","Data":"d56d12cadeeec88b14ea4ab85207d0e37ef39f62b3596bada68052f5e441c1fe"} Jan 29 04:19:03 crc kubenswrapper[4707]: I0129 04:19:03.147664 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.170042476 podStartE2EDuration="31.14764377s" podCreationTimestamp="2026-01-29 04:18:32 +0000 UTC" firstStartedPulling="2026-01-29 04:18:34.036060031 +0000 UTC m=+3067.520288936" lastFinishedPulling="2026-01-29 04:19:02.013661325 +0000 UTC m=+3095.497890230" observedRunningTime="2026-01-29 04:19:03.136096838 +0000 UTC m=+3096.620325743" watchObservedRunningTime="2026-01-29 04:19:03.14764377 +0000 UTC m=+3096.631872675" Jan 29 04:19:03 crc kubenswrapper[4707]: I0129 04:19:03.634916 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:03 crc kubenswrapper[4707]: I0129 04:19:03.634973 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:03 crc kubenswrapper[4707]: I0129 04:19:03.637841 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:04 crc kubenswrapper[4707]: I0129 04:19:04.124050 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.727114 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.728094 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" containerName="openstackclient" containerID="cri-o://cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8" gracePeriod=2 Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.738016 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.770753 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 29 04:19:05 crc kubenswrapper[4707]: E0129 04:19:05.771245 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" containerName="openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.771265 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" containerName="openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.771454 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" containerName="openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.772269 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.776466 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" podUID="2afcd46a-a1c0-41cf-866e-3a39e0ac9a36" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.797485 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.870395 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jkct\" (UniqueName: \"kubernetes.io/projected/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-kube-api-access-2jkct\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.870590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-openstack-config-secret\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.870639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-openstack-config\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.870682 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.972763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jkct\" (UniqueName: \"kubernetes.io/projected/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-kube-api-access-2jkct\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.972856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-openstack-config-secret\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.972892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-openstack-config\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.972918 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.974104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-openstack-config\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.980016 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-openstack-config-secret\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.980031 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:05 crc kubenswrapper[4707]: I0129 04:19:05.989875 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jkct\" (UniqueName: \"kubernetes.io/projected/2afcd46a-a1c0-41cf-866e-3a39e0ac9a36-kube-api-access-2jkct\") pod \"openstackclient\" (UID: \"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36\") " pod="openstack/openstackclient" Jan 29 04:19:06 crc kubenswrapper[4707]: I0129 04:19:06.089386 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 04:19:06 crc kubenswrapper[4707]: I0129 04:19:06.144778 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 29 04:19:06 crc kubenswrapper[4707]: I0129 04:19:06.145058 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-api" containerID="cri-o://2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469" gracePeriod=30 Jan 29 04:19:06 crc kubenswrapper[4707]: I0129 04:19:06.145582 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-evaluator" containerID="cri-o://fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f" gracePeriod=30 Jan 29 04:19:06 crc kubenswrapper[4707]: I0129 04:19:06.145513 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-notifier" containerID="cri-o://dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865" gracePeriod=30 Jan 29 04:19:06 crc kubenswrapper[4707]: I0129 04:19:06.145726 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-listener" containerID="cri-o://83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3" gracePeriod=30 Jan 29 04:19:06 crc kubenswrapper[4707]: I0129 04:19:06.706862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 04:19:06 crc kubenswrapper[4707]: W0129 04:19:06.717396 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2afcd46a_a1c0_41cf_866e_3a39e0ac9a36.slice/crio-6ce46b009ff98e13a2a4686bbb2328007c757f9093f5262ae07e1e50a0f9db4b WatchSource:0}: Error finding container 6ce46b009ff98e13a2a4686bbb2328007c757f9093f5262ae07e1e50a0f9db4b: Status 404 returned error can't find the container with id 6ce46b009ff98e13a2a4686bbb2328007c757f9093f5262ae07e1e50a0f9db4b Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.083966 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.223654 4707 generic.go:334] "Generic (PLEG): container finished" podID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerID="fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f" exitCode=0 Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.223707 4707 generic.go:334] "Generic (PLEG): container finished" podID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerID="2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469" exitCode=0 Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.223813 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerDied","Data":"fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f"} Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.223848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerDied","Data":"2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469"} Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.226762 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36","Type":"ContainerStarted","Data":"ada32babc500017550ff5246411a6393f966393b60750715fa9ecd322c1c358e"} Jan 29 04:19:07 crc 
kubenswrapper[4707]: I0129 04:19:07.226826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2afcd46a-a1c0-41cf-866e-3a39e0ac9a36","Type":"ContainerStarted","Data":"6ce46b009ff98e13a2a4686bbb2328007c757f9093f5262ae07e1e50a0f9db4b"} Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.226850 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="prometheus" containerID="cri-o://f8f19396d6f2d620ad43f834b927a1eb9fcf835523329a5fd6cce130abc3324c" gracePeriod=600 Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.227000 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="thanos-sidecar" containerID="cri-o://d56d12cadeeec88b14ea4ab85207d0e37ef39f62b3596bada68052f5e441c1fe" gracePeriod=600 Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.227070 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="config-reloader" containerID="cri-o://49d2cc4780ffe0932c4e0f0e9379207a54689a710b13dbc0d38c7c7d9e94bd19" gracePeriod=600 Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.249395 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:19:07 crc kubenswrapper[4707]: E0129 04:19:07.250019 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" 
podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.280654 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.2806351 podStartE2EDuration="2.2806351s" podCreationTimestamp="2026-01-29 04:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:19:07.257069402 +0000 UTC m=+3100.741298307" watchObservedRunningTime="2026-01-29 04:19:07.2806351 +0000 UTC m=+3100.764864005" Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.970456 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 04:19:07 crc kubenswrapper[4707]: I0129 04:19:07.978749 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" podUID="2afcd46a-a1c0-41cf-866e-3a39e0ac9a36" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.120866 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-combined-ca-bundle\") pod \"12d355a2-5cc3-43c5-96b0-b11f83de869d\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.121410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config\") pod \"12d355a2-5cc3-43c5-96b0-b11f83de869d\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.121519 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52grj\" (UniqueName: 
\"kubernetes.io/projected/12d355a2-5cc3-43c5-96b0-b11f83de869d-kube-api-access-52grj\") pod \"12d355a2-5cc3-43c5-96b0-b11f83de869d\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.121632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config-secret\") pod \"12d355a2-5cc3-43c5-96b0-b11f83de869d\" (UID: \"12d355a2-5cc3-43c5-96b0-b11f83de869d\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.150136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12d355a2-5cc3-43c5-96b0-b11f83de869d-kube-api-access-52grj" (OuterVolumeSpecName: "kube-api-access-52grj") pod "12d355a2-5cc3-43c5-96b0-b11f83de869d" (UID: "12d355a2-5cc3-43c5-96b0-b11f83de869d"). InnerVolumeSpecName "kube-api-access-52grj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.157713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "12d355a2-5cc3-43c5-96b0-b11f83de869d" (UID: "12d355a2-5cc3-43c5-96b0-b11f83de869d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.192480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12d355a2-5cc3-43c5-96b0-b11f83de869d" (UID: "12d355a2-5cc3-43c5-96b0-b11f83de869d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.214462 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "12d355a2-5cc3-43c5-96b0-b11f83de869d" (UID: "12d355a2-5cc3-43c5-96b0-b11f83de869d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.224639 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.224733 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.224787 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52grj\" (UniqueName: \"kubernetes.io/projected/12d355a2-5cc3-43c5-96b0-b11f83de869d-kube-api-access-52grj\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.224858 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12d355a2-5cc3-43c5-96b0-b11f83de869d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.243997 4707 generic.go:334] "Generic (PLEG): container finished" podID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerID="d56d12cadeeec88b14ea4ab85207d0e37ef39f62b3596bada68052f5e441c1fe" exitCode=0 Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.244032 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerID="49d2cc4780ffe0932c4e0f0e9379207a54689a710b13dbc0d38c7c7d9e94bd19" exitCode=0 Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.244039 4707 generic.go:334] "Generic (PLEG): container finished" podID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerID="f8f19396d6f2d620ad43f834b927a1eb9fcf835523329a5fd6cce130abc3324c" exitCode=0 Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.244093 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerDied","Data":"d56d12cadeeec88b14ea4ab85207d0e37ef39f62b3596bada68052f5e441c1fe"} Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.244145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerDied","Data":"49d2cc4780ffe0932c4e0f0e9379207a54689a710b13dbc0d38c7c7d9e94bd19"} Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.244161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerDied","Data":"f8f19396d6f2d620ad43f834b927a1eb9fcf835523329a5fd6cce130abc3324c"} Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.244181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03149070-c2c3-42b3-a0a1-246ecd8c46c0","Type":"ContainerDied","Data":"5fce1216f37a865c034ce7bd8ee2af4d0aa8779009cf75d70f5fadd0e90c7629"} Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.244196 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fce1216f37a865c034ce7bd8ee2af4d0aa8779009cf75d70f5fadd0e90c7629" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.251382 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerID="dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865" exitCode=0 Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.251458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerDied","Data":"dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865"} Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.253423 4707 generic.go:334] "Generic (PLEG): container finished" podID="12d355a2-5cc3-43c5-96b0-b11f83de869d" containerID="cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8" exitCode=137 Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.254824 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.255680 4707 scope.go:117] "RemoveContainer" containerID="cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.259806 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" podUID="2afcd46a-a1c0-41cf-866e-3a39e0ac9a36" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.309428 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" podUID="2afcd46a-a1c0-41cf-866e-3a39e0ac9a36" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.311808 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.328936 4707 scope.go:117] "RemoveContainer" containerID="cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8" Jan 29 04:19:08 crc kubenswrapper[4707]: E0129 04:19:08.329603 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8\": container with ID starting with cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8 not found: ID does not exist" containerID="cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.329658 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8"} err="failed to get container status \"cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8\": rpc error: code = NotFound desc = could not find container \"cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8\": container with ID starting with cf58b4329c514be887be9e13d54f9f792a756234df1648ed7dc587e26d5012e8 not found: ID does not exist" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config-out\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432362 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-web-config\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 
29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432455 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-0\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-2\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-1\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bm2k\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-kube-api-access-7bm2k\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432857 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-tls-assets\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-thanos-prometheus-http-client-file\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.432931 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\" (UID: \"03149070-c2c3-42b3-a0a1-246ecd8c46c0\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.438236 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.438882 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.439171 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.440781 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.441091 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config" (OuterVolumeSpecName: "config") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.443662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.445635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.446814 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config-out" (OuterVolumeSpecName: "config-out") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.448613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-kube-api-access-7bm2k" (OuterVolumeSpecName: "kube-api-access-7bm2k") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "kube-api-access-7bm2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.468019 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-web-config" (OuterVolumeSpecName: "web-config") pod "03149070-c2c3-42b3-a0a1-246ecd8c46c0" (UID: "03149070-c2c3-42b3-a0a1-246ecd8c46c0"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.535471 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.535910 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.535923 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.535937 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bm2k\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-kube-api-access-7bm2k\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.535947 4707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03149070-c2c3-42b3-a0a1-246ecd8c46c0-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.535956 4707 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.535994 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.536005 4707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03149070-c2c3-42b3-a0a1-246ecd8c46c0-config-out\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.536014 4707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03149070-c2c3-42b3-a0a1-246ecd8c46c0-web-config\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.536024 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03149070-c2c3-42b3-a0a1-246ecd8c46c0-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.561125 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.640242 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.793707 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.946862 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-combined-ca-bundle\") pod \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.947002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-scripts\") pod \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.947852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-config-data\") pod \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.948036 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftrfs\" (UniqueName: \"kubernetes.io/projected/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-kube-api-access-ftrfs\") pod \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.948067 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-internal-tls-certs\") pod \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.948121 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-public-tls-certs\") pod \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\" (UID: \"ccbcf7c5-84ab-4303-a31c-d99b0162cd42\") " Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.951702 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-scripts" (OuterVolumeSpecName: "scripts") pod "ccbcf7c5-84ab-4303-a31c-d99b0162cd42" (UID: "ccbcf7c5-84ab-4303-a31c-d99b0162cd42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:08 crc kubenswrapper[4707]: I0129 04:19:08.953052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-kube-api-access-ftrfs" (OuterVolumeSpecName: "kube-api-access-ftrfs") pod "ccbcf7c5-84ab-4303-a31c-d99b0162cd42" (UID: "ccbcf7c5-84ab-4303-a31c-d99b0162cd42"). InnerVolumeSpecName "kube-api-access-ftrfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.004326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ccbcf7c5-84ab-4303-a31c-d99b0162cd42" (UID: "ccbcf7c5-84ab-4303-a31c-d99b0162cd42"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.005699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ccbcf7c5-84ab-4303-a31c-d99b0162cd42" (UID: "ccbcf7c5-84ab-4303-a31c-d99b0162cd42"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.049608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-config-data" (OuterVolumeSpecName: "config-data") pod "ccbcf7c5-84ab-4303-a31c-d99b0162cd42" (UID: "ccbcf7c5-84ab-4303-a31c-d99b0162cd42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.050659 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.050684 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftrfs\" (UniqueName: \"kubernetes.io/projected/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-kube-api-access-ftrfs\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.050694 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.050703 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.050713 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.059574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccbcf7c5-84ab-4303-a31c-d99b0162cd42" (UID: "ccbcf7c5-84ab-4303-a31c-d99b0162cd42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.152666 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcf7c5-84ab-4303-a31c-d99b0162cd42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.259103 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12d355a2-5cc3-43c5-96b0-b11f83de869d" path="/var/lib/kubelet/pods/12d355a2-5cc3-43c5-96b0-b11f83de869d/volumes" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.272601 4707 generic.go:334] "Generic (PLEG): container finished" podID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerID="83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3" exitCode=0 Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.272717 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.274158 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.276919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerDied","Data":"83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3"} Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.276972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ccbcf7c5-84ab-4303-a31c-d99b0162cd42","Type":"ContainerDied","Data":"50a810bc0be28c2c85a514e862f9844f29ab5ec15b04a558d7c09e33dacfacd4"} Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.276993 4707 scope.go:117] "RemoveContainer" containerID="83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.309816 4707 scope.go:117] "RemoveContainer" containerID="dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.342150 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.353083 4707 scope.go:117] "RemoveContainer" containerID="fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.370750 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.383672 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.384161 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="prometheus" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384173 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="prometheus" Jan 29 04:19:09 crc 
kubenswrapper[4707]: E0129 04:19:09.384186 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="thanos-sidecar" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384191 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="thanos-sidecar" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.384206 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-notifier" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384212 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-notifier" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.384220 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-listener" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384227 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-listener" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.384241 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="config-reloader" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384247 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="config-reloader" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.384255 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-evaluator" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384260 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-evaluator" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 
04:19:09.384271 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-api" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384278 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-api" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.384311 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="init-config-reloader" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384318 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="init-config-reloader" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384566 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-listener" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384583 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="prometheus" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384601 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-evaluator" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384613 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-api" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384622 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="thanos-sidecar" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.384630 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" containerName="aodh-notifier" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 
04:19:09.384643 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" containerName="config-reloader" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.386516 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.388988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-8fk7t" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.391986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.392059 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.392320 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.393133 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.396677 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.410122 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.419829 4707 scope.go:117] "RemoveContainer" containerID="2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.423172 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.458323 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:19:09 crc kubenswrapper[4707]: 
I0129 04:19:09.462238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-config-data\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.462313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-scripts\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.462438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-public-tls-certs\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.462465 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-internal-tls-certs\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.462494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.462528 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snhn9\" (UniqueName: 
\"kubernetes.io/projected/701e271e-81d3-4a93-a724-761ec5a242f6-kube-api-access-snhn9\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.475447 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.481320 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-k8x5b" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.481639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.481704 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.481755 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.482011 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.493953 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.498997 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.499064 4707 scope.go:117] "RemoveContainer" containerID="83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.499251 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 29 04:19:09 crc 
kubenswrapper[4707]: I0129 04:19:09.499316 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.499366 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.513704 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3\": container with ID starting with 83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3 not found: ID does not exist" containerID="83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.513761 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3"} err="failed to get container status \"83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3\": rpc error: code = NotFound desc = could not find container \"83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3\": container with ID starting with 83ab279d2b94e49c60e0cdda0ce16e8f16b390477193f8522e1a394af7208ba3 not found: ID does not exist" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.513789 4707 scope.go:117] "RemoveContainer" containerID="dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.531723 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865\": container with ID starting with dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865 not found: ID does not exist" 
containerID="dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.531772 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865"} err="failed to get container status \"dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865\": rpc error: code = NotFound desc = could not find container \"dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865\": container with ID starting with dbf2aa30b92607c6867d35b1dcf1ef6eaadfd9a40a301142917ef6b93b91c865 not found: ID does not exist" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.531804 4707 scope.go:117] "RemoveContainer" containerID="fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.535123 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f\": container with ID starting with fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f not found: ID does not exist" containerID="fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.535340 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f"} err="failed to get container status \"fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f\": rpc error: code = NotFound desc = could not find container \"fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f\": container with ID starting with fb0bd3c0ebd764eea18b68a96d3a904903350aa16188f95196851f6130b71e0f not found: ID does not exist" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.535362 4707 scope.go:117] 
"RemoveContainer" containerID="2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469" Jan 29 04:19:09 crc kubenswrapper[4707]: E0129 04:19:09.537004 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469\": container with ID starting with 2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469 not found: ID does not exist" containerID="2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.537028 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469"} err="failed to get container status \"2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469\": rpc error: code = NotFound desc = could not find container \"2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469\": container with ID starting with 2828f87c7143484142145122659bff9093a9a231d2e2f7ebb38f056328aee469 not found: ID does not exist" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.566005 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-public-tls-certs\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.566265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-internal-tls-certs\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.566352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.566438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snhn9\" (UniqueName: \"kubernetes.io/projected/701e271e-81d3-4a93-a724-761ec5a242f6-kube-api-access-snhn9\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.566646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-config-data\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.566734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-scripts\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.580591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-public-tls-certs\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.598469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-scripts\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.599249 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-config-data\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.601158 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snhn9\" (UniqueName: \"kubernetes.io/projected/701e271e-81d3-4a93-a724-761ec5a242f6-kube-api-access-snhn9\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.606135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-internal-tls-certs\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.606172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670551 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgssg\" (UniqueName: \"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-kube-api-access-lgssg\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670720 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670815 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670907 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.670956 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.721231 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773422 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgssg\" (UniqueName: \"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-kube-api-access-lgssg\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773480 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " 
pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773750 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.773940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.774947 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.775777 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.777791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.778281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.780339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.780771 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.781343 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.782084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.782452 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.782708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.788121 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " 
pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.788619 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.800184 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgssg\" (UniqueName: \"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-kube-api-access-lgssg\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:09 crc kubenswrapper[4707]: I0129 04:19:09.817585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:10 crc kubenswrapper[4707]: I0129 04:19:10.115186 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:10 crc kubenswrapper[4707]: I0129 04:19:10.216049 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:19:10 crc kubenswrapper[4707]: W0129 04:19:10.216730 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod701e271e_81d3_4a93_a724_761ec5a242f6.slice/crio-d5fe1423918f026ec492752c9f9f61962a6ca895e3dae71755ae9763172d5b51 WatchSource:0}: Error finding container d5fe1423918f026ec492752c9f9f61962a6ca895e3dae71755ae9763172d5b51: Status 404 returned error can't find the container with id d5fe1423918f026ec492752c9f9f61962a6ca895e3dae71755ae9763172d5b51 Jan 29 04:19:10 crc kubenswrapper[4707]: I0129 04:19:10.303493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerStarted","Data":"d5fe1423918f026ec492752c9f9f61962a6ca895e3dae71755ae9763172d5b51"} Jan 29 04:19:10 crc kubenswrapper[4707]: W0129 04:19:10.632366 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8ea8c44_a3ed_4acb_acff_a2fa113e1d79.slice/crio-0bfded254f8d63cdea78997e5d7b6d588a529f5d0f86c32c61bc31a7ec0e5fa6 WatchSource:0}: Error finding container 0bfded254f8d63cdea78997e5d7b6d588a529f5d0f86c32c61bc31a7ec0e5fa6: Status 404 returned error can't find the container with id 0bfded254f8d63cdea78997e5d7b6d588a529f5d0f86c32c61bc31a7ec0e5fa6 Jan 29 04:19:10 crc kubenswrapper[4707]: I0129 04:19:10.634390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:19:11 crc kubenswrapper[4707]: I0129 04:19:11.255258 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03149070-c2c3-42b3-a0a1-246ecd8c46c0" path="/var/lib/kubelet/pods/03149070-c2c3-42b3-a0a1-246ecd8c46c0/volumes" 
Jan 29 04:19:11 crc kubenswrapper[4707]: I0129 04:19:11.256259 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccbcf7c5-84ab-4303-a31c-d99b0162cd42" path="/var/lib/kubelet/pods/ccbcf7c5-84ab-4303-a31c-d99b0162cd42/volumes" Jan 29 04:19:11 crc kubenswrapper[4707]: I0129 04:19:11.328078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerStarted","Data":"a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498"} Jan 29 04:19:11 crc kubenswrapper[4707]: I0129 04:19:11.332964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerStarted","Data":"0bfded254f8d63cdea78997e5d7b6d588a529f5d0f86c32c61bc31a7ec0e5fa6"} Jan 29 04:19:12 crc kubenswrapper[4707]: I0129 04:19:12.347181 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerStarted","Data":"1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22"} Jan 29 04:19:13 crc kubenswrapper[4707]: I0129 04:19:13.361739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerStarted","Data":"4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a"} Jan 29 04:19:14 crc kubenswrapper[4707]: I0129 04:19:14.383950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerStarted","Data":"a6f6c2d8da4761b1ff682d9317dafbed965a7d16dd5bb7e8c379760973b3e2f3"} Jan 29 04:19:14 crc kubenswrapper[4707]: I0129 04:19:14.388607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerStarted","Data":"c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff"} Jan 29 04:19:14 crc kubenswrapper[4707]: I0129 04:19:14.454309 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.811964336 podStartE2EDuration="5.454286976s" podCreationTimestamp="2026-01-29 04:19:09 +0000 UTC" firstStartedPulling="2026-01-29 04:19:10.219563277 +0000 UTC m=+3103.703792182" lastFinishedPulling="2026-01-29 04:19:12.861885917 +0000 UTC m=+3106.346114822" observedRunningTime="2026-01-29 04:19:14.440037798 +0000 UTC m=+3107.924266703" watchObservedRunningTime="2026-01-29 04:19:14.454286976 +0000 UTC m=+3107.938515881" Jan 29 04:19:19 crc kubenswrapper[4707]: I0129 04:19:19.244345 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:19:19 crc kubenswrapper[4707]: E0129 04:19:19.246395 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:19:23 crc kubenswrapper[4707]: I0129 04:19:23.490788 4707 generic.go:334] "Generic (PLEG): container finished" podID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerID="a6f6c2d8da4761b1ff682d9317dafbed965a7d16dd5bb7e8c379760973b3e2f3" exitCode=0 Jan 29 04:19:23 crc kubenswrapper[4707]: I0129 04:19:23.490920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerDied","Data":"a6f6c2d8da4761b1ff682d9317dafbed965a7d16dd5bb7e8c379760973b3e2f3"} Jan 
29 04:19:24 crc kubenswrapper[4707]: I0129 04:19:24.509998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerStarted","Data":"fe595dcf1d71d62b70413d778def635fc8db790fd6af7d2c720dd6b18422688c"} Jan 29 04:19:28 crc kubenswrapper[4707]: I0129 04:19:28.557901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerStarted","Data":"a66eb1e08a290c16664923c27f4360dffa5dd330c7bce17f9655d0d2791fd9ab"} Jan 29 04:19:28 crc kubenswrapper[4707]: I0129 04:19:28.558794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerStarted","Data":"e4b6b3d4affd2f277943c960510543a32174da2242c26fea3f8cff1f3c3ac375"} Jan 29 04:19:28 crc kubenswrapper[4707]: I0129 04:19:28.615489 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.615454432 podStartE2EDuration="19.615454432s" podCreationTimestamp="2026-01-29 04:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:19:28.593105279 +0000 UTC m=+3122.077334194" watchObservedRunningTime="2026-01-29 04:19:28.615454432 +0000 UTC m=+3122.099683397" Jan 29 04:19:30 crc kubenswrapper[4707]: I0129 04:19:30.115704 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:32 crc kubenswrapper[4707]: I0129 04:19:32.244322 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:19:32 crc kubenswrapper[4707]: E0129 04:19:32.246263 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:19:40 crc kubenswrapper[4707]: I0129 04:19:40.115528 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:40 crc kubenswrapper[4707]: I0129 04:19:40.124569 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:40 crc kubenswrapper[4707]: I0129 04:19:40.729331 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.030594 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pqkc5"] Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.033424 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.058282 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqkc5"] Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.117946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khh7t\" (UniqueName: \"kubernetes.io/projected/1bfeb405-aec7-493e-8aee-d33d7e1fb900-kube-api-access-khh7t\") pod \"redhat-marketplace-pqkc5\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.118007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-catalog-content\") pod \"redhat-marketplace-pqkc5\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.118065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-utilities\") pod \"redhat-marketplace-pqkc5\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.219900 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khh7t\" (UniqueName: \"kubernetes.io/projected/1bfeb405-aec7-493e-8aee-d33d7e1fb900-kube-api-access-khh7t\") pod \"redhat-marketplace-pqkc5\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.219991 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-catalog-content\") pod \"redhat-marketplace-pqkc5\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.220081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-utilities\") pod \"redhat-marketplace-pqkc5\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.220675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-catalog-content\") pod \"redhat-marketplace-pqkc5\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.220715 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-utilities\") pod \"redhat-marketplace-pqkc5\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.242802 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khh7t\" (UniqueName: \"kubernetes.io/projected/1bfeb405-aec7-493e-8aee-d33d7e1fb900-kube-api-access-khh7t\") pod \"redhat-marketplace-pqkc5\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:43 crc kubenswrapper[4707]: I0129 04:19:43.372913 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:44 crc kubenswrapper[4707]: I0129 04:19:44.020198 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqkc5"] Jan 29 04:19:44 crc kubenswrapper[4707]: I0129 04:19:44.247273 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:19:44 crc kubenswrapper[4707]: E0129 04:19:44.247735 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:19:44 crc kubenswrapper[4707]: I0129 04:19:44.795177 4707 generic.go:334] "Generic (PLEG): container finished" podID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerID="7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de" exitCode=0 Jan 29 04:19:44 crc kubenswrapper[4707]: I0129 04:19:44.795321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqkc5" event={"ID":"1bfeb405-aec7-493e-8aee-d33d7e1fb900","Type":"ContainerDied","Data":"7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de"} Jan 29 04:19:44 crc kubenswrapper[4707]: I0129 04:19:44.795722 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqkc5" event={"ID":"1bfeb405-aec7-493e-8aee-d33d7e1fb900","Type":"ContainerStarted","Data":"e9532da551e870683fbca8be0d2b5a121106d97647b9befdb465c13ab24bffa3"} Jan 29 04:19:45 crc kubenswrapper[4707]: I0129 04:19:45.812580 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqkc5" 
event={"ID":"1bfeb405-aec7-493e-8aee-d33d7e1fb900","Type":"ContainerStarted","Data":"78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f"} Jan 29 04:19:47 crc kubenswrapper[4707]: I0129 04:19:47.916904 4707 generic.go:334] "Generic (PLEG): container finished" podID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerID="78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f" exitCode=0 Jan 29 04:19:47 crc kubenswrapper[4707]: I0129 04:19:47.917273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqkc5" event={"ID":"1bfeb405-aec7-493e-8aee-d33d7e1fb900","Type":"ContainerDied","Data":"78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f"} Jan 29 04:19:48 crc kubenswrapper[4707]: I0129 04:19:48.937383 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqkc5" event={"ID":"1bfeb405-aec7-493e-8aee-d33d7e1fb900","Type":"ContainerStarted","Data":"cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377"} Jan 29 04:19:48 crc kubenswrapper[4707]: I0129 04:19:48.970591 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pqkc5" podStartSLOduration=2.46585647 podStartE2EDuration="5.970573856s" podCreationTimestamp="2026-01-29 04:19:43 +0000 UTC" firstStartedPulling="2026-01-29 04:19:44.798743162 +0000 UTC m=+3138.282972067" lastFinishedPulling="2026-01-29 04:19:48.303460548 +0000 UTC m=+3141.787689453" observedRunningTime="2026-01-29 04:19:48.968743905 +0000 UTC m=+3142.452972850" watchObservedRunningTime="2026-01-29 04:19:48.970573856 +0000 UTC m=+3142.454802761" Jan 29 04:19:53 crc kubenswrapper[4707]: I0129 04:19:53.373884 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:53 crc kubenswrapper[4707]: I0129 04:19:53.375147 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:53 crc kubenswrapper[4707]: I0129 04:19:53.480258 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:54 crc kubenswrapper[4707]: I0129 04:19:54.056779 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:54 crc kubenswrapper[4707]: I0129 04:19:54.127998 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqkc5"] Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.036684 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pqkc5" podUID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerName="registry-server" containerID="cri-o://cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377" gracePeriod=2 Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.674135 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.805887 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khh7t\" (UniqueName: \"kubernetes.io/projected/1bfeb405-aec7-493e-8aee-d33d7e1fb900-kube-api-access-khh7t\") pod \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.806824 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-catalog-content\") pod \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.807059 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-utilities\") pod \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\" (UID: \"1bfeb405-aec7-493e-8aee-d33d7e1fb900\") " Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.807964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-utilities" (OuterVolumeSpecName: "utilities") pod "1bfeb405-aec7-493e-8aee-d33d7e1fb900" (UID: "1bfeb405-aec7-493e-8aee-d33d7e1fb900"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.808932 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.817364 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bfeb405-aec7-493e-8aee-d33d7e1fb900-kube-api-access-khh7t" (OuterVolumeSpecName: "kube-api-access-khh7t") pod "1bfeb405-aec7-493e-8aee-d33d7e1fb900" (UID: "1bfeb405-aec7-493e-8aee-d33d7e1fb900"). InnerVolumeSpecName "kube-api-access-khh7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.842954 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1bfeb405-aec7-493e-8aee-d33d7e1fb900" (UID: "1bfeb405-aec7-493e-8aee-d33d7e1fb900"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.918763 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khh7t\" (UniqueName: \"kubernetes.io/projected/1bfeb405-aec7-493e-8aee-d33d7e1fb900-kube-api-access-khh7t\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:56 crc kubenswrapper[4707]: I0129 04:19:56.918844 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bfeb405-aec7-493e-8aee-d33d7e1fb900-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.051847 4707 generic.go:334] "Generic (PLEG): container finished" podID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerID="cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377" exitCode=0 Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.051916 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqkc5" event={"ID":"1bfeb405-aec7-493e-8aee-d33d7e1fb900","Type":"ContainerDied","Data":"cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377"} Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.052850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqkc5" event={"ID":"1bfeb405-aec7-493e-8aee-d33d7e1fb900","Type":"ContainerDied","Data":"e9532da551e870683fbca8be0d2b5a121106d97647b9befdb465c13ab24bffa3"} Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.052889 4707 scope.go:117] "RemoveContainer" containerID="cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.051985 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqkc5" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.102396 4707 scope.go:117] "RemoveContainer" containerID="78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.104886 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqkc5"] Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.115726 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqkc5"] Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.133342 4707 scope.go:117] "RemoveContainer" containerID="7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.191983 4707 scope.go:117] "RemoveContainer" containerID="cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377" Jan 29 04:19:57 crc kubenswrapper[4707]: E0129 04:19:57.195300 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377\": container with ID starting with cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377 not found: ID does not exist" containerID="cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.195371 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377"} err="failed to get container status \"cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377\": rpc error: code = NotFound desc = could not find container \"cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377\": container with ID starting with cc32b77e781c4b89ea93d08bb31d2475f5d99f3b74906362d9d8a04b358eb377 not found: 
ID does not exist" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.195411 4707 scope.go:117] "RemoveContainer" containerID="78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f" Jan 29 04:19:57 crc kubenswrapper[4707]: E0129 04:19:57.196009 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f\": container with ID starting with 78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f not found: ID does not exist" containerID="78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.196076 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f"} err="failed to get container status \"78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f\": rpc error: code = NotFound desc = could not find container \"78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f\": container with ID starting with 78e6d682e705fa6c0280eb302ab95c1cd2249b41b786716ed61dbc8e0521486f not found: ID does not exist" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.196122 4707 scope.go:117] "RemoveContainer" containerID="7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de" Jan 29 04:19:57 crc kubenswrapper[4707]: E0129 04:19:57.196471 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de\": container with ID starting with 7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de not found: ID does not exist" containerID="7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.196511 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de"} err="failed to get container status \"7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de\": rpc error: code = NotFound desc = could not find container \"7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de\": container with ID starting with 7e30aa06e29a55812db5526816968b3eb0b505fc265fd9597c39aa13e4c7d9de not found: ID does not exist" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.252809 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:19:57 crc kubenswrapper[4707]: E0129 04:19:57.253180 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:19:57 crc kubenswrapper[4707]: I0129 04:19:57.259808 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" path="/var/lib/kubelet/pods/1bfeb405-aec7-493e-8aee-d33d7e1fb900/volumes" Jan 29 04:20:10 crc kubenswrapper[4707]: I0129 04:20:10.244201 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:20:10 crc kubenswrapper[4707]: E0129 04:20:10.245800 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:20:21 crc kubenswrapper[4707]: I0129 04:20:21.244031 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:20:21 crc kubenswrapper[4707]: E0129 04:20:21.245075 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:20:35 crc kubenswrapper[4707]: I0129 04:20:35.244884 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:20:35 crc kubenswrapper[4707]: E0129 04:20:35.246060 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:20:40 crc kubenswrapper[4707]: I0129 04:20:40.469681 4707 scope.go:117] "RemoveContainer" containerID="075cc51254ac7c32ac05d354a2cafb5d2ec7ff354c438b5ed6e04d04600fa134" Jan 29 04:20:40 crc kubenswrapper[4707]: I0129 04:20:40.506629 4707 scope.go:117] "RemoveContainer" containerID="5c54d45b4f9f23f0903408461c273b58b53221372e63c415464039f5821180b0" Jan 29 04:20:40 crc kubenswrapper[4707]: I0129 04:20:40.535509 4707 scope.go:117] "RemoveContainer" containerID="aa7a834f4604e4c08243fedba3960346fbcefaaf36952fbe23f5a9d511cc9915" Jan 
29 04:20:40 crc kubenswrapper[4707]: I0129 04:20:40.568132 4707 scope.go:117] "RemoveContainer" containerID="880cd035c4c9f5c897c9e4732f15afafb230b08d45ce385d0175f754dfa2fd3f" Jan 29 04:20:47 crc kubenswrapper[4707]: I0129 04:20:47.251683 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:20:47 crc kubenswrapper[4707]: E0129 04:20:47.252406 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:21:00 crc kubenswrapper[4707]: I0129 04:21:00.243603 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:21:00 crc kubenswrapper[4707]: E0129 04:21:00.244706 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:21:09 crc kubenswrapper[4707]: I0129 04:21:09.280981 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7886d5cc69-w8rzq_0a32b73c-f66f-425f-81a9-ef1cc36041d4/manager/0.log" Jan 29 04:21:12 crc kubenswrapper[4707]: I0129 04:21:12.244883 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:21:12 crc kubenswrapper[4707]: I0129 
04:21:12.310169 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:21:12 crc kubenswrapper[4707]: I0129 04:21:12.310487 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="prometheus" containerID="cri-o://fe595dcf1d71d62b70413d778def635fc8db790fd6af7d2c720dd6b18422688c" gracePeriod=600 Jan 29 04:21:12 crc kubenswrapper[4707]: I0129 04:21:12.310675 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="thanos-sidecar" containerID="cri-o://a66eb1e08a290c16664923c27f4360dffa5dd330c7bce17f9655d0d2791fd9ab" gracePeriod=600 Jan 29 04:21:12 crc kubenswrapper[4707]: I0129 04:21:12.310712 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="config-reloader" containerID="cri-o://e4b6b3d4affd2f277943c960510543a32174da2242c26fea3f8cff1f3c3ac375" gracePeriod=600 Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.057921 4707 generic.go:334] "Generic (PLEG): container finished" podID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerID="a66eb1e08a290c16664923c27f4360dffa5dd330c7bce17f9655d0d2791fd9ab" exitCode=0 Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.057981 4707 generic.go:334] "Generic (PLEG): container finished" podID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerID="e4b6b3d4affd2f277943c960510543a32174da2242c26fea3f8cff1f3c3ac375" exitCode=0 Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.058000 4707 generic.go:334] "Generic (PLEG): container finished" podID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerID="fe595dcf1d71d62b70413d778def635fc8db790fd6af7d2c720dd6b18422688c" exitCode=0 Jan 29 04:21:13 crc 
kubenswrapper[4707]: I0129 04:21:13.058055 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerDied","Data":"a66eb1e08a290c16664923c27f4360dffa5dd330c7bce17f9655d0d2791fd9ab"} Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.058174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerDied","Data":"e4b6b3d4affd2f277943c960510543a32174da2242c26fea3f8cff1f3c3ac375"} Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.058208 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerDied","Data":"fe595dcf1d71d62b70413d778def635fc8db790fd6af7d2c720dd6b18422688c"} Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.061457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"3abdad2d3cfc260430421335c82eca649c20a852667da0e2e621e99916452489"} Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.368682 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.456290 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-tls-assets\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.456409 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-1\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.456504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-2\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.456577 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.456633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config-out\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.456789 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-secret-combined-ca-bundle\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.456881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-thanos-prometheus-http-client-file\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.456910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.456978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.457007 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgssg\" (UniqueName: \"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-kube-api-access-lgssg\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.457101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-0\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.457131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.457204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\" (UID: \"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79\") " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.458293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.458492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.459363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.468837 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.468892 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-kube-api-access-lgssg" (OuterVolumeSpecName: "kube-api-access-lgssg") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "kube-api-access-lgssg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.468869 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.470494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.471438 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config" (OuterVolumeSpecName: "config") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.472844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.475249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.491398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.494877 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config-out" (OuterVolumeSpecName: "config-out") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560338 4707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560408 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560426 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560443 4707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config-out\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560459 4707 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560476 4707 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560516 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560533 4707 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560580 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgssg\" (UniqueName: \"kubernetes.io/projected/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-kube-api-access-lgssg\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560599 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560614 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-config\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.560629 4707 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.571258 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config" (OuterVolumeSpecName: "web-config") pod "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" (UID: "c8ea8c44-a3ed-4acb-acff-a2fa113e1d79"). 
InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.589399 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.663118 4707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79-web-config\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:13 crc kubenswrapper[4707]: I0129 04:21:13.663159 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 29 04:21:14 crc kubenswrapper[4707]: I0129 04:21:14.080862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c8ea8c44-a3ed-4acb-acff-a2fa113e1d79","Type":"ContainerDied","Data":"0bfded254f8d63cdea78997e5d7b6d588a529f5d0f86c32c61bc31a7ec0e5fa6"} Jan 29 04:21:14 crc kubenswrapper[4707]: I0129 04:21:14.080990 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:14 crc kubenswrapper[4707]: I0129 04:21:14.081330 4707 scope.go:117] "RemoveContainer" containerID="a66eb1e08a290c16664923c27f4360dffa5dd330c7bce17f9655d0d2791fd9ab" Jan 29 04:21:14 crc kubenswrapper[4707]: I0129 04:21:14.129745 4707 scope.go:117] "RemoveContainer" containerID="e4b6b3d4affd2f277943c960510543a32174da2242c26fea3f8cff1f3c3ac375" Jan 29 04:21:14 crc kubenswrapper[4707]: I0129 04:21:14.155791 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:21:14 crc kubenswrapper[4707]: I0129 04:21:14.166914 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:21:14 crc kubenswrapper[4707]: I0129 04:21:14.188815 4707 scope.go:117] "RemoveContainer" containerID="fe595dcf1d71d62b70413d778def635fc8db790fd6af7d2c720dd6b18422688c" Jan 29 04:21:14 crc kubenswrapper[4707]: I0129 04:21:14.230931 4707 scope.go:117] "RemoveContainer" containerID="a6f6c2d8da4761b1ff682d9317dafbed965a7d16dd5bb7e8c379760973b3e2f3" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.260946 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" path="/var/lib/kubelet/pods/c8ea8c44-a3ed-4acb-acff-a2fa113e1d79/volumes" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.262719 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:21:15 crc kubenswrapper[4707]: E0129 04:21:15.263127 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerName="extract-utilities" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263141 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerName="extract-utilities" Jan 29 04:21:15 crc kubenswrapper[4707]: E0129 04:21:15.263159 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="prometheus" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263166 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="prometheus" Jan 29 04:21:15 crc kubenswrapper[4707]: E0129 04:21:15.263185 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="init-config-reloader" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263194 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="init-config-reloader" Jan 29 04:21:15 crc kubenswrapper[4707]: E0129 04:21:15.263207 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerName="registry-server" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263214 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerName="registry-server" Jan 29 04:21:15 crc kubenswrapper[4707]: E0129 04:21:15.263229 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="thanos-sidecar" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263235 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="thanos-sidecar" Jan 29 04:21:15 crc kubenswrapper[4707]: E0129 04:21:15.263260 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="config-reloader" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263268 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="config-reloader" Jan 29 04:21:15 crc kubenswrapper[4707]: E0129 04:21:15.263284 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerName="extract-content" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263291 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerName="extract-content" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263518 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="prometheus" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263557 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bfeb405-aec7-493e-8aee-d33d7e1fb900" containerName="registry-server" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263574 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="thanos-sidecar" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.263587 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ea8c44-a3ed-4acb-acff-a2fa113e1d79" containerName="config-reloader" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.266641 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.266449 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.271289 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.271494 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.271617 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.271622 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.271833 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.271963 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.272058 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.272015 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-k8x5b" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.284138 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410186 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410271 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " 
pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5njg\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-kube-api-access-h5njg\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-config\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410885 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410919 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.410981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.411049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.411078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.411163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513700 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513815 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5njg\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-kube-api-access-h5njg\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513922 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-config\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.513981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.515206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.516150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.516369 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.517299 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.521399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.522174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.522954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-config\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.525037 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.525404 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.525646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.525690 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.530051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " 
pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.545358 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5njg\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-kube-api-access-h5njg\") pod \"prometheus-metric-storage-0\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:15 crc kubenswrapper[4707]: I0129 04:21:15.592076 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:16 crc kubenswrapper[4707]: W0129 04:21:16.365677 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59879132_88c3_47df_97e0_17d51326d313.slice/crio-4bef02129057aa3088ef18c250de3955e550d8c0f989116026d2e9062ef2c348 WatchSource:0}: Error finding container 4bef02129057aa3088ef18c250de3955e550d8c0f989116026d2e9062ef2c348: Status 404 returned error can't find the container with id 4bef02129057aa3088ef18c250de3955e550d8c0f989116026d2e9062ef2c348 Jan 29 04:21:16 crc kubenswrapper[4707]: I0129 04:21:16.370450 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:21:17 crc kubenswrapper[4707]: I0129 04:21:17.127591 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerStarted","Data":"4bef02129057aa3088ef18c250de3955e550d8c0f989116026d2e9062ef2c348"} Jan 29 04:21:22 crc kubenswrapper[4707]: I0129 04:21:22.200135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerStarted","Data":"226ace85b1500ff22422028c24a0dc45a7643f5047bafb12b5e02f6528f117b4"} Jan 29 04:21:32 crc kubenswrapper[4707]: I0129 
04:21:32.365214 4707 generic.go:334] "Generic (PLEG): container finished" podID="59879132-88c3-47df-97e0-17d51326d313" containerID="226ace85b1500ff22422028c24a0dc45a7643f5047bafb12b5e02f6528f117b4" exitCode=0 Jan 29 04:21:32 crc kubenswrapper[4707]: I0129 04:21:32.365341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerDied","Data":"226ace85b1500ff22422028c24a0dc45a7643f5047bafb12b5e02f6528f117b4"} Jan 29 04:21:33 crc kubenswrapper[4707]: I0129 04:21:33.382008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerStarted","Data":"2af0aad06c6108a37a3daefc3c0e45e21c0511b3eeab974b4189d3168d954400"} Jan 29 04:21:38 crc kubenswrapper[4707]: I0129 04:21:38.470723 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerStarted","Data":"63a83181087476bd3335e0d653edbccdeb4e07fbe1039741e14489b44c0f875e"} Jan 29 04:21:38 crc kubenswrapper[4707]: I0129 04:21:38.471850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerStarted","Data":"986bc585e35d6dc652e08b265d8f2e26005bd358b1fe3bd7d75a1fcc4b931124"} Jan 29 04:21:38 crc kubenswrapper[4707]: I0129 04:21:38.536293 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.536257276 podStartE2EDuration="23.536257276s" podCreationTimestamp="2026-01-29 04:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:21:38.512177014 +0000 UTC m=+3251.996405949" watchObservedRunningTime="2026-01-29 
04:21:38.536257276 +0000 UTC m=+3252.020486231" Jan 29 04:21:40 crc kubenswrapper[4707]: I0129 04:21:40.592697 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:45 crc kubenswrapper[4707]: I0129 04:21:45.593037 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:45 crc kubenswrapper[4707]: I0129 04:21:45.609301 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 29 04:21:46 crc kubenswrapper[4707]: I0129 04:21:46.598755 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 29 04:22:03 crc kubenswrapper[4707]: I0129 04:22:03.811392 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t7kgh"] Jan 29 04:22:03 crc kubenswrapper[4707]: I0129 04:22:03.817074 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:03 crc kubenswrapper[4707]: I0129 04:22:03.827583 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t7kgh"] Jan 29 04:22:03 crc kubenswrapper[4707]: I0129 04:22:03.926969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvvlv\" (UniqueName: \"kubernetes.io/projected/d98a64df-84e2-4c43-91dd-3558698fc9c1-kube-api-access-hvvlv\") pod \"redhat-operators-t7kgh\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:03 crc kubenswrapper[4707]: I0129 04:22:03.927365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-catalog-content\") pod \"redhat-operators-t7kgh\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:03 crc kubenswrapper[4707]: I0129 04:22:03.927471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-utilities\") pod \"redhat-operators-t7kgh\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:04 crc kubenswrapper[4707]: I0129 04:22:04.031044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvvlv\" (UniqueName: \"kubernetes.io/projected/d98a64df-84e2-4c43-91dd-3558698fc9c1-kube-api-access-hvvlv\") pod \"redhat-operators-t7kgh\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:04 crc kubenswrapper[4707]: I0129 04:22:04.031272 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-catalog-content\") pod \"redhat-operators-t7kgh\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:04 crc kubenswrapper[4707]: I0129 04:22:04.031318 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-utilities\") pod \"redhat-operators-t7kgh\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:04 crc kubenswrapper[4707]: I0129 04:22:04.032199 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-catalog-content\") pod \"redhat-operators-t7kgh\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:04 crc kubenswrapper[4707]: I0129 04:22:04.032231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-utilities\") pod \"redhat-operators-t7kgh\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:04 crc kubenswrapper[4707]: I0129 04:22:04.054952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvvlv\" (UniqueName: \"kubernetes.io/projected/d98a64df-84e2-4c43-91dd-3558698fc9c1-kube-api-access-hvvlv\") pod \"redhat-operators-t7kgh\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:04 crc kubenswrapper[4707]: I0129 04:22:04.151608 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:04 crc kubenswrapper[4707]: W0129 04:22:04.668898 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd98a64df_84e2_4c43_91dd_3558698fc9c1.slice/crio-1218ebda2cc44eed853aa13d1f1c438de72471a3a5d7be8a5894a56c112a18c0 WatchSource:0}: Error finding container 1218ebda2cc44eed853aa13d1f1c438de72471a3a5d7be8a5894a56c112a18c0: Status 404 returned error can't find the container with id 1218ebda2cc44eed853aa13d1f1c438de72471a3a5d7be8a5894a56c112a18c0 Jan 29 04:22:04 crc kubenswrapper[4707]: I0129 04:22:04.675889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t7kgh"] Jan 29 04:22:04 crc kubenswrapper[4707]: I0129 04:22:04.872319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7kgh" event={"ID":"d98a64df-84e2-4c43-91dd-3558698fc9c1","Type":"ContainerStarted","Data":"1218ebda2cc44eed853aa13d1f1c438de72471a3a5d7be8a5894a56c112a18c0"} Jan 29 04:22:05 crc kubenswrapper[4707]: I0129 04:22:05.886304 4707 generic.go:334] "Generic (PLEG): container finished" podID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerID="a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606" exitCode=0 Jan 29 04:22:05 crc kubenswrapper[4707]: I0129 04:22:05.886390 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7kgh" event={"ID":"d98a64df-84e2-4c43-91dd-3558698fc9c1","Type":"ContainerDied","Data":"a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606"} Jan 29 04:22:05 crc kubenswrapper[4707]: I0129 04:22:05.890279 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 04:22:06 crc kubenswrapper[4707]: I0129 04:22:06.908031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-t7kgh" event={"ID":"d98a64df-84e2-4c43-91dd-3558698fc9c1","Type":"ContainerStarted","Data":"5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1"} Jan 29 04:22:11 crc kubenswrapper[4707]: I0129 04:22:11.988079 4707 generic.go:334] "Generic (PLEG): container finished" podID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerID="5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1" exitCode=0 Jan 29 04:22:11 crc kubenswrapper[4707]: I0129 04:22:11.989442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7kgh" event={"ID":"d98a64df-84e2-4c43-91dd-3558698fc9c1","Type":"ContainerDied","Data":"5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1"} Jan 29 04:22:13 crc kubenswrapper[4707]: I0129 04:22:13.007042 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7kgh" event={"ID":"d98a64df-84e2-4c43-91dd-3558698fc9c1","Type":"ContainerStarted","Data":"71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72"} Jan 29 04:22:13 crc kubenswrapper[4707]: I0129 04:22:13.048071 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t7kgh" podStartSLOduration=3.47838278 podStartE2EDuration="10.04804028s" podCreationTimestamp="2026-01-29 04:22:03 +0000 UTC" firstStartedPulling="2026-01-29 04:22:05.890033321 +0000 UTC m=+3279.374262226" lastFinishedPulling="2026-01-29 04:22:12.459690821 +0000 UTC m=+3285.943919726" observedRunningTime="2026-01-29 04:22:13.02940914 +0000 UTC m=+3286.513638105" watchObservedRunningTime="2026-01-29 04:22:13.04804028 +0000 UTC m=+3286.532269195" Jan 29 04:22:14 crc kubenswrapper[4707]: I0129 04:22:14.152066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:14 crc kubenswrapper[4707]: I0129 04:22:14.152520 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:15 crc kubenswrapper[4707]: I0129 04:22:15.235611 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t7kgh" podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerName="registry-server" probeResult="failure" output=< Jan 29 04:22:15 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 29 04:22:15 crc kubenswrapper[4707]: > Jan 29 04:22:24 crc kubenswrapper[4707]: I0129 04:22:24.222749 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:24 crc kubenswrapper[4707]: I0129 04:22:24.321038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:24 crc kubenswrapper[4707]: I0129 04:22:24.496404 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t7kgh"] Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.172919 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t7kgh" podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerName="registry-server" containerID="cri-o://71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72" gracePeriod=2 Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.690265 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.773733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvvlv\" (UniqueName: \"kubernetes.io/projected/d98a64df-84e2-4c43-91dd-3558698fc9c1-kube-api-access-hvvlv\") pod \"d98a64df-84e2-4c43-91dd-3558698fc9c1\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.773785 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-utilities\") pod \"d98a64df-84e2-4c43-91dd-3558698fc9c1\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.773954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-catalog-content\") pod \"d98a64df-84e2-4c43-91dd-3558698fc9c1\" (UID: \"d98a64df-84e2-4c43-91dd-3558698fc9c1\") " Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.775380 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-utilities" (OuterVolumeSpecName: "utilities") pod "d98a64df-84e2-4c43-91dd-3558698fc9c1" (UID: "d98a64df-84e2-4c43-91dd-3558698fc9c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.782568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98a64df-84e2-4c43-91dd-3558698fc9c1-kube-api-access-hvvlv" (OuterVolumeSpecName: "kube-api-access-hvvlv") pod "d98a64df-84e2-4c43-91dd-3558698fc9c1" (UID: "d98a64df-84e2-4c43-91dd-3558698fc9c1"). InnerVolumeSpecName "kube-api-access-hvvlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.877014 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvvlv\" (UniqueName: \"kubernetes.io/projected/d98a64df-84e2-4c43-91dd-3558698fc9c1-kube-api-access-hvvlv\") on node \"crc\" DevicePath \"\"" Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.877064 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.917660 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d98a64df-84e2-4c43-91dd-3558698fc9c1" (UID: "d98a64df-84e2-4c43-91dd-3558698fc9c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:22:26 crc kubenswrapper[4707]: I0129 04:22:26.979084 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98a64df-84e2-4c43-91dd-3558698fc9c1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.190093 4707 generic.go:334] "Generic (PLEG): container finished" podID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerID="71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72" exitCode=0 Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.190174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7kgh" event={"ID":"d98a64df-84e2-4c43-91dd-3558698fc9c1","Type":"ContainerDied","Data":"71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72"} Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.190242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-t7kgh" event={"ID":"d98a64df-84e2-4c43-91dd-3558698fc9c1","Type":"ContainerDied","Data":"1218ebda2cc44eed853aa13d1f1c438de72471a3a5d7be8a5894a56c112a18c0"} Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.190278 4707 scope.go:117] "RemoveContainer" containerID="71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.190236 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t7kgh" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.225937 4707 scope.go:117] "RemoveContainer" containerID="5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.267648 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t7kgh"] Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.272838 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t7kgh"] Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.274989 4707 scope.go:117] "RemoveContainer" containerID="a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.333699 4707 scope.go:117] "RemoveContainer" containerID="71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72" Jan 29 04:22:27 crc kubenswrapper[4707]: E0129 04:22:27.334523 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72\": container with ID starting with 71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72 not found: ID does not exist" containerID="71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.334598 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72"} err="failed to get container status \"71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72\": rpc error: code = NotFound desc = could not find container \"71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72\": container with ID starting with 71919cd40ef168a665f439feeedb8b7d5cdfa1dcd19f3ea7b6dec085b805fe72 not found: ID does not exist" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.334636 4707 scope.go:117] "RemoveContainer" containerID="5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1" Jan 29 04:22:27 crc kubenswrapper[4707]: E0129 04:22:27.335217 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1\": container with ID starting with 5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1 not found: ID does not exist" containerID="5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.335290 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1"} err="failed to get container status \"5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1\": rpc error: code = NotFound desc = could not find container \"5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1\": container with ID starting with 5ca2c032e00a568ba1ddc6f260ea39ff8a4df8dc89926f336c1eeef8a7db1bd1 not found: ID does not exist" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.335328 4707 scope.go:117] "RemoveContainer" containerID="a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606" Jan 29 04:22:27 crc kubenswrapper[4707]: E0129 
04:22:27.335891 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606\": container with ID starting with a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606 not found: ID does not exist" containerID="a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606" Jan 29 04:22:27 crc kubenswrapper[4707]: I0129 04:22:27.336457 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606"} err="failed to get container status \"a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606\": rpc error: code = NotFound desc = could not find container \"a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606\": container with ID starting with a597416aead86bb0920d0c8e0234792b5b9c8df8ba7f9bbb1c1e331221e04606 not found: ID does not exist" Jan 29 04:22:29 crc kubenswrapper[4707]: I0129 04:22:29.262602 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" path="/var/lib/kubelet/pods/d98a64df-84e2-4c43-91dd-3558698fc9c1/volumes" Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.836762 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j742g"] Jan 29 04:22:33 crc kubenswrapper[4707]: E0129 04:22:33.840386 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerName="extract-content" Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.840416 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerName="extract-content" Jan 29 04:22:33 crc kubenswrapper[4707]: E0129 04:22:33.840465 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerName="registry-server" Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.840477 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerName="registry-server" Jan 29 04:22:33 crc kubenswrapper[4707]: E0129 04:22:33.840489 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerName="extract-utilities" Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.840498 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerName="extract-utilities" Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.840756 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98a64df-84e2-4c43-91dd-3558698fc9c1" containerName="registry-server" Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.842919 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.879209 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j742g"] Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.918397 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-utilities\") pod \"community-operators-j742g\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.918937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jswn\" (UniqueName: \"kubernetes.io/projected/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-kube-api-access-7jswn\") pod \"community-operators-j742g\" (UID: 
\"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:33 crc kubenswrapper[4707]: I0129 04:22:33.919118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-catalog-content\") pod \"community-operators-j742g\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:34 crc kubenswrapper[4707]: I0129 04:22:34.021558 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-utilities\") pod \"community-operators-j742g\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:34 crc kubenswrapper[4707]: I0129 04:22:34.021683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jswn\" (UniqueName: \"kubernetes.io/projected/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-kube-api-access-7jswn\") pod \"community-operators-j742g\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:34 crc kubenswrapper[4707]: I0129 04:22:34.021716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-catalog-content\") pod \"community-operators-j742g\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:34 crc kubenswrapper[4707]: I0129 04:22:34.022347 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-catalog-content\") pod \"community-operators-j742g\" (UID: 
\"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:34 crc kubenswrapper[4707]: I0129 04:22:34.022364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-utilities\") pod \"community-operators-j742g\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:34 crc kubenswrapper[4707]: I0129 04:22:34.056669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jswn\" (UniqueName: \"kubernetes.io/projected/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-kube-api-access-7jswn\") pod \"community-operators-j742g\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:34 crc kubenswrapper[4707]: I0129 04:22:34.170403 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:34 crc kubenswrapper[4707]: I0129 04:22:34.752783 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j742g"] Jan 29 04:22:35 crc kubenswrapper[4707]: I0129 04:22:35.298600 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerID="c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d" exitCode=0 Jan 29 04:22:35 crc kubenswrapper[4707]: I0129 04:22:35.298678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j742g" event={"ID":"5f84d398-2a6c-41f3-9fa4-bff11e3317cf","Type":"ContainerDied","Data":"c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d"} Jan 29 04:22:35 crc kubenswrapper[4707]: I0129 04:22:35.298712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j742g" 
event={"ID":"5f84d398-2a6c-41f3-9fa4-bff11e3317cf","Type":"ContainerStarted","Data":"92c99e618bef1e4c2757cda4c169f06972d5708a61319858b2d112d95412ff8e"} Jan 29 04:22:37 crc kubenswrapper[4707]: I0129 04:22:37.413031 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j742g" event={"ID":"5f84d398-2a6c-41f3-9fa4-bff11e3317cf","Type":"ContainerStarted","Data":"253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf"} Jan 29 04:22:38 crc kubenswrapper[4707]: I0129 04:22:38.430393 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerID="253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf" exitCode=0 Jan 29 04:22:38 crc kubenswrapper[4707]: I0129 04:22:38.430589 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j742g" event={"ID":"5f84d398-2a6c-41f3-9fa4-bff11e3317cf","Type":"ContainerDied","Data":"253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf"} Jan 29 04:22:39 crc kubenswrapper[4707]: I0129 04:22:39.454235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j742g" event={"ID":"5f84d398-2a6c-41f3-9fa4-bff11e3317cf","Type":"ContainerStarted","Data":"aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80"} Jan 29 04:22:39 crc kubenswrapper[4707]: I0129 04:22:39.483890 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j742g" podStartSLOduration=2.876127674 podStartE2EDuration="6.48386169s" podCreationTimestamp="2026-01-29 04:22:33 +0000 UTC" firstStartedPulling="2026-01-29 04:22:35.301396505 +0000 UTC m=+3308.785625410" lastFinishedPulling="2026-01-29 04:22:38.909130521 +0000 UTC m=+3312.393359426" observedRunningTime="2026-01-29 04:22:39.47530902 +0000 UTC m=+3312.959537935" watchObservedRunningTime="2026-01-29 04:22:39.48386169 +0000 UTC 
m=+3312.968090615" Jan 29 04:22:44 crc kubenswrapper[4707]: I0129 04:22:44.170849 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:44 crc kubenswrapper[4707]: I0129 04:22:44.171862 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:44 crc kubenswrapper[4707]: I0129 04:22:44.225097 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:44 crc kubenswrapper[4707]: I0129 04:22:44.599187 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:44 crc kubenswrapper[4707]: I0129 04:22:44.663114 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j742g"] Jan 29 04:22:46 crc kubenswrapper[4707]: I0129 04:22:46.558444 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j742g" podUID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerName="registry-server" containerID="cri-o://aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80" gracePeriod=2 Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.141328 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.256505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-catalog-content\") pod \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.256727 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jswn\" (UniqueName: \"kubernetes.io/projected/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-kube-api-access-7jswn\") pod \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.256799 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-utilities\") pod \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\" (UID: \"5f84d398-2a6c-41f3-9fa4-bff11e3317cf\") " Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.258245 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-utilities" (OuterVolumeSpecName: "utilities") pod "5f84d398-2a6c-41f3-9fa4-bff11e3317cf" (UID: "5f84d398-2a6c-41f3-9fa4-bff11e3317cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.266724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-kube-api-access-7jswn" (OuterVolumeSpecName: "kube-api-access-7jswn") pod "5f84d398-2a6c-41f3-9fa4-bff11e3317cf" (UID: "5f84d398-2a6c-41f3-9fa4-bff11e3317cf"). InnerVolumeSpecName "kube-api-access-7jswn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.308786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f84d398-2a6c-41f3-9fa4-bff11e3317cf" (UID: "5f84d398-2a6c-41f3-9fa4-bff11e3317cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.359619 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jswn\" (UniqueName: \"kubernetes.io/projected/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-kube-api-access-7jswn\") on node \"crc\" DevicePath \"\"" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.359656 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.359671 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f84d398-2a6c-41f3-9fa4-bff11e3317cf-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.571988 4707 generic.go:334] "Generic (PLEG): container finished" podID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerID="aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80" exitCode=0 Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.572125 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j742g" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.572121 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j742g" event={"ID":"5f84d398-2a6c-41f3-9fa4-bff11e3317cf","Type":"ContainerDied","Data":"aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80"} Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.572549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j742g" event={"ID":"5f84d398-2a6c-41f3-9fa4-bff11e3317cf","Type":"ContainerDied","Data":"92c99e618bef1e4c2757cda4c169f06972d5708a61319858b2d112d95412ff8e"} Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.572580 4707 scope.go:117] "RemoveContainer" containerID="aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.604787 4707 scope.go:117] "RemoveContainer" containerID="253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.615318 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j742g"] Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.626156 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j742g"] Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.633269 4707 scope.go:117] "RemoveContainer" containerID="c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.692700 4707 scope.go:117] "RemoveContainer" containerID="aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80" Jan 29 04:22:47 crc kubenswrapper[4707]: E0129 04:22:47.693318 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80\": container with ID starting with aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80 not found: ID does not exist" containerID="aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.693376 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80"} err="failed to get container status \"aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80\": rpc error: code = NotFound desc = could not find container \"aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80\": container with ID starting with aed4dc886c9e4aaacab45c13b69cae1136a1d9290530448d46baedc611642b80 not found: ID does not exist" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.693410 4707 scope.go:117] "RemoveContainer" containerID="253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf" Jan 29 04:22:47 crc kubenswrapper[4707]: E0129 04:22:47.694037 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf\": container with ID starting with 253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf not found: ID does not exist" containerID="253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.694163 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf"} err="failed to get container status \"253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf\": rpc error: code = NotFound desc = could not find container \"253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf\": container with ID 
starting with 253301d9a5c171722c577236cd561c12839553991e9bccde8a90579dc70421bf not found: ID does not exist" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.694261 4707 scope.go:117] "RemoveContainer" containerID="c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d" Jan 29 04:22:47 crc kubenswrapper[4707]: E0129 04:22:47.695856 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d\": container with ID starting with c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d not found: ID does not exist" containerID="c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d" Jan 29 04:22:47 crc kubenswrapper[4707]: I0129 04:22:47.695912 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d"} err="failed to get container status \"c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d\": rpc error: code = NotFound desc = could not find container \"c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d\": container with ID starting with c758d216b6aed750cf7dc74521dec006d2275f12481afa4d2d9ec8910e920a0d not found: ID does not exist" Jan 29 04:22:49 crc kubenswrapper[4707]: I0129 04:22:49.259585 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" path="/var/lib/kubelet/pods/5f84d398-2a6c-41f3-9fa4-bff11e3317cf/volumes" Jan 29 04:23:11 crc kubenswrapper[4707]: I0129 04:23:11.353619 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7886d5cc69-w8rzq_0a32b73c-f66f-425f-81a9-ef1cc36041d4/manager/0.log" Jan 29 04:23:13 crc kubenswrapper[4707]: I0129 04:23:13.088458 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 29 
04:23:13 crc kubenswrapper[4707]: I0129 04:23:13.089314 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-api" containerID="cri-o://a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498" gracePeriod=30 Jan 29 04:23:13 crc kubenswrapper[4707]: I0129 04:23:13.089376 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-listener" containerID="cri-o://c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff" gracePeriod=30 Jan 29 04:23:13 crc kubenswrapper[4707]: I0129 04:23:13.089393 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-evaluator" containerID="cri-o://1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22" gracePeriod=30 Jan 29 04:23:13 crc kubenswrapper[4707]: I0129 04:23:13.089423 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-notifier" containerID="cri-o://4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a" gracePeriod=30 Jan 29 04:23:13 crc kubenswrapper[4707]: I0129 04:23:13.927816 4707 generic.go:334] "Generic (PLEG): container finished" podID="701e271e-81d3-4a93-a724-761ec5a242f6" containerID="1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22" exitCode=0 Jan 29 04:23:13 crc kubenswrapper[4707]: I0129 04:23:13.928227 4707 generic.go:334] "Generic (PLEG): container finished" podID="701e271e-81d3-4a93-a724-761ec5a242f6" containerID="a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498" exitCode=0 Jan 29 04:23:13 crc kubenswrapper[4707]: I0129 04:23:13.927860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerDied","Data":"1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22"} Jan 29 04:23:13 crc kubenswrapper[4707]: I0129 04:23:13.928277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerDied","Data":"a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498"} Jan 29 04:23:20 crc kubenswrapper[4707]: I0129 04:23:20.038530 4707 generic.go:334] "Generic (PLEG): container finished" podID="701e271e-81d3-4a93-a724-761ec5a242f6" containerID="4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a" exitCode=0 Jan 29 04:23:20 crc kubenswrapper[4707]: I0129 04:23:20.038579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerDied","Data":"4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a"} Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.709788 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.718169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-scripts\") pod \"701e271e-81d3-4a93-a724-761ec5a242f6\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.718306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-combined-ca-bundle\") pod \"701e271e-81d3-4a93-a724-761ec5a242f6\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.718360 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snhn9\" (UniqueName: \"kubernetes.io/projected/701e271e-81d3-4a93-a724-761ec5a242f6-kube-api-access-snhn9\") pod \"701e271e-81d3-4a93-a724-761ec5a242f6\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.718583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-public-tls-certs\") pod \"701e271e-81d3-4a93-a724-761ec5a242f6\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.718731 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-config-data\") pod \"701e271e-81d3-4a93-a724-761ec5a242f6\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.718777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-internal-tls-certs\") pod \"701e271e-81d3-4a93-a724-761ec5a242f6\" (UID: \"701e271e-81d3-4a93-a724-761ec5a242f6\") " Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.728565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701e271e-81d3-4a93-a724-761ec5a242f6-kube-api-access-snhn9" (OuterVolumeSpecName: "kube-api-access-snhn9") pod "701e271e-81d3-4a93-a724-761ec5a242f6" (UID: "701e271e-81d3-4a93-a724-761ec5a242f6"). InnerVolumeSpecName "kube-api-access-snhn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.728763 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-scripts" (OuterVolumeSpecName: "scripts") pod "701e271e-81d3-4a93-a724-761ec5a242f6" (UID: "701e271e-81d3-4a93-a724-761ec5a242f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.821437 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.821953 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snhn9\" (UniqueName: \"kubernetes.io/projected/701e271e-81d3-4a93-a724-761ec5a242f6-kube-api-access-snhn9\") on node \"crc\" DevicePath \"\"" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.840915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "701e271e-81d3-4a93-a724-761ec5a242f6" (UID: "701e271e-81d3-4a93-a724-761ec5a242f6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.845977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "701e271e-81d3-4a93-a724-761ec5a242f6" (UID: "701e271e-81d3-4a93-a724-761ec5a242f6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.882606 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "701e271e-81d3-4a93-a724-761ec5a242f6" (UID: "701e271e-81d3-4a93-a724-761ec5a242f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.922925 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.922967 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.922976 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 04:23:21 crc kubenswrapper[4707]: I0129 04:23:21.956871 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-config-data" (OuterVolumeSpecName: 
"config-data") pod "701e271e-81d3-4a93-a724-761ec5a242f6" (UID: "701e271e-81d3-4a93-a724-761ec5a242f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.025906 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701e271e-81d3-4a93-a724-761ec5a242f6-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.065351 4707 generic.go:334] "Generic (PLEG): container finished" podID="701e271e-81d3-4a93-a724-761ec5a242f6" containerID="c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff" exitCode=0 Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.065423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerDied","Data":"c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff"} Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.065468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"701e271e-81d3-4a93-a724-761ec5a242f6","Type":"ContainerDied","Data":"d5fe1423918f026ec492752c9f9f61962a6ca895e3dae71755ae9763172d5b51"} Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.065492 4707 scope.go:117] "RemoveContainer" containerID="c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.065531 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.100162 4707 scope.go:117] "RemoveContainer" containerID="4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.123887 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.136935 4707 scope.go:117] "RemoveContainer" containerID="1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.140952 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.161257 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.162033 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerName="extract-content" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162055 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerName="extract-content" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.162077 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-evaluator" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162084 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-evaluator" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.162098 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerName="registry-server" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162107 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" 
containerName="registry-server" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.162134 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerName="extract-utilities" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162141 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerName="extract-utilities" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.162156 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-notifier" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162162 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-notifier" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.162171 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-listener" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162176 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-listener" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.162192 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-api" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162198 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-api" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162392 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-notifier" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162407 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-api" 
Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162425 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-evaluator" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162440 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f84d398-2a6c-41f3-9fa4-bff11e3317cf" containerName="registry-server" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.162457 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" containerName="aodh-listener" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.164179 4707 scope.go:117] "RemoveContainer" containerID="a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.164609 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.167525 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.167607 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.167859 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.169612 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-8fk7t" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.179861 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.189303 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.196986 4707 
scope.go:117] "RemoveContainer" containerID="c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.198340 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff\": container with ID starting with c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff not found: ID does not exist" containerID="c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.198387 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff"} err="failed to get container status \"c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff\": rpc error: code = NotFound desc = could not find container \"c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff\": container with ID starting with c47993f8494c8e376993a3805a8db12c2fb239c998f46a72636aad6232ec23ff not found: ID does not exist" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.198417 4707 scope.go:117] "RemoveContainer" containerID="4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.198700 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a\": container with ID starting with 4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a not found: ID does not exist" containerID="4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.198725 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a"} err="failed to get container status \"4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a\": rpc error: code = NotFound desc = could not find container \"4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a\": container with ID starting with 4d885d95b8a04d5de2720b5ae88a2ccc0a28493b6d526019a83a587570c2541a not found: ID does not exist" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.198738 4707 scope.go:117] "RemoveContainer" containerID="1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.199019 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22\": container with ID starting with 1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22 not found: ID does not exist" containerID="1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.199041 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22"} err="failed to get container status \"1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22\": rpc error: code = NotFound desc = could not find container \"1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22\": container with ID starting with 1a86d704c5e46e84f6a94ea09335a831e0183e8aa2bd6f348491f5d654713d22 not found: ID does not exist" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.199052 4707 scope.go:117] "RemoveContainer" containerID="a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498" Jan 29 04:23:22 crc kubenswrapper[4707]: E0129 04:23:22.199584 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498\": container with ID starting with a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498 not found: ID does not exist" containerID="a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.199608 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498"} err="failed to get container status \"a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498\": rpc error: code = NotFound desc = could not find container \"a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498\": container with ID starting with a625e8d6635ec26325c99c3a60251e736867ab41f1172fca3d1d98c268670498 not found: ID does not exist" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.231175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.231258 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f88c\" (UniqueName: \"kubernetes.io/projected/1c46106d-ca5d-4cac-820f-cf2935abf6d8-kube-api-access-4f88c\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.231299 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-public-tls-certs\") pod \"aodh-0\" (UID: 
\"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.231365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-internal-tls-certs\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.231402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-config-data\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.231439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-scripts\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.332492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-scripts\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.332571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.332716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f88c\" (UniqueName: 
\"kubernetes.io/projected/1c46106d-ca5d-4cac-820f-cf2935abf6d8-kube-api-access-4f88c\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.332812 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-public-tls-certs\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.332889 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-internal-tls-certs\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.332980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-config-data\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.338906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.339680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-scripts\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.339838 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-internal-tls-certs\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.340400 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-public-tls-certs\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.341579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c46106d-ca5d-4cac-820f-cf2935abf6d8-config-data\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.359227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f88c\" (UniqueName: \"kubernetes.io/projected/1c46106d-ca5d-4cac-820f-cf2935abf6d8-kube-api-access-4f88c\") pod \"aodh-0\" (UID: \"1c46106d-ca5d-4cac-820f-cf2935abf6d8\") " pod="openstack/aodh-0" Jan 29 04:23:22 crc kubenswrapper[4707]: I0129 04:23:22.485662 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 29 04:23:23 crc kubenswrapper[4707]: I0129 04:23:23.053042 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 29 04:23:23 crc kubenswrapper[4707]: I0129 04:23:23.076775 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1c46106d-ca5d-4cac-820f-cf2935abf6d8","Type":"ContainerStarted","Data":"faf2c78d41945732e24f687535e27ce03aa0edd2ced1178b363e5204b9c76782"} Jan 29 04:23:23 crc kubenswrapper[4707]: I0129 04:23:23.254261 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701e271e-81d3-4a93-a724-761ec5a242f6" path="/var/lib/kubelet/pods/701e271e-81d3-4a93-a724-761ec5a242f6/volumes" Jan 29 04:23:24 crc kubenswrapper[4707]: I0129 04:23:24.096368 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1c46106d-ca5d-4cac-820f-cf2935abf6d8","Type":"ContainerStarted","Data":"7d3702a340c97c51c261773361fc82495db06d5394b8bf8ce05279b5ad0296ed"} Jan 29 04:23:25 crc kubenswrapper[4707]: I0129 04:23:25.110507 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1c46106d-ca5d-4cac-820f-cf2935abf6d8","Type":"ContainerStarted","Data":"01f7de3bf604e0ccac0e37d2a8f4dda96b0eac824d75ce4ff02b9b73cc85f3fe"} Jan 29 04:23:26 crc kubenswrapper[4707]: I0129 04:23:26.128053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1c46106d-ca5d-4cac-820f-cf2935abf6d8","Type":"ContainerStarted","Data":"80262a889c33b56209aca1995c5cffc6c90af1d26ad1ffcb509d57eeba7f872b"} Jan 29 04:23:27 crc kubenswrapper[4707]: I0129 04:23:27.146971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1c46106d-ca5d-4cac-820f-cf2935abf6d8","Type":"ContainerStarted","Data":"923c8017b292c2d09b68690d149ce7ed7a715c6dc7dd97abe952d55aefa1b394"} Jan 29 04:23:27 crc kubenswrapper[4707]: I0129 04:23:27.215945 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.17133281 podStartE2EDuration="5.215903452s" podCreationTimestamp="2026-01-29 04:23:22 +0000 UTC" firstStartedPulling="2026-01-29 04:23:23.060369673 +0000 UTC m=+3356.544598578" lastFinishedPulling="2026-01-29 04:23:26.104940295 +0000 UTC m=+3359.589169220" observedRunningTime="2026-01-29 04:23:27.20906349 +0000 UTC m=+3360.693292395" watchObservedRunningTime="2026-01-29 04:23:27.215903452 +0000 UTC m=+3360.700132357" Jan 29 04:23:33 crc kubenswrapper[4707]: I0129 04:23:33.463679 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:23:33 crc kubenswrapper[4707]: I0129 04:23:33.464563 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:23:37 crc kubenswrapper[4707]: I0129 04:23:37.054409 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-zm8bg"] Jan 29 04:23:37 crc kubenswrapper[4707]: I0129 04:23:37.070610 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-c6cd-account-create-update-68v6b"] Jan 29 04:23:37 crc kubenswrapper[4707]: I0129 04:23:37.082861 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-zm8bg"] Jan 29 04:23:37 crc kubenswrapper[4707]: I0129 04:23:37.091674 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-c6cd-account-create-update-68v6b"] Jan 29 04:23:37 crc kubenswrapper[4707]: I0129 
04:23:37.289694 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1" path="/var/lib/kubelet/pods/5b884bf3-5ea2-4aa6-b29f-ddb49c3f94b1/volumes" Jan 29 04:23:37 crc kubenswrapper[4707]: I0129 04:23:37.290479 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d200ad2-aa0f-468d-92ac-0563af93b582" path="/var/lib/kubelet/pods/7d200ad2-aa0f-468d-92ac-0563af93b582/volumes" Jan 29 04:23:40 crc kubenswrapper[4707]: I0129 04:23:40.754982 4707 scope.go:117] "RemoveContainer" containerID="5e866acd235ea292f5d705af25827b8dab3a3533aead98e43b94854731d335bc" Jan 29 04:23:40 crc kubenswrapper[4707]: I0129 04:23:40.789945 4707 scope.go:117] "RemoveContainer" containerID="c48b6e2824a85ef22836018a375a237791d888983bf2b4628081cd0e862c3a85" Jan 29 04:23:49 crc kubenswrapper[4707]: I0129 04:23:49.047696 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-m4nxz"] Jan 29 04:23:49 crc kubenswrapper[4707]: I0129 04:23:49.058927 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-m4nxz"] Jan 29 04:23:49 crc kubenswrapper[4707]: I0129 04:23:49.265068 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b75fe528-48a1-41ae-af17-80199293c062" path="/var/lib/kubelet/pods/b75fe528-48a1-41ae-af17-80199293c062/volumes" Jan 29 04:24:03 crc kubenswrapper[4707]: I0129 04:24:03.463383 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:24:03 crc kubenswrapper[4707]: I0129 04:24:03.464323 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:24:33 crc kubenswrapper[4707]: I0129 04:24:33.473134 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:24:33 crc kubenswrapper[4707]: I0129 04:24:33.474129 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:24:33 crc kubenswrapper[4707]: I0129 04:24:33.474219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 04:24:33 crc kubenswrapper[4707]: I0129 04:24:33.476734 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3abdad2d3cfc260430421335c82eca649c20a852667da0e2e621e99916452489"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 04:24:33 crc kubenswrapper[4707]: I0129 04:24:33.476862 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://3abdad2d3cfc260430421335c82eca649c20a852667da0e2e621e99916452489" gracePeriod=600 Jan 29 04:24:34 crc kubenswrapper[4707]: I0129 04:24:34.204386 4707 generic.go:334] "Generic 
(PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="3abdad2d3cfc260430421335c82eca649c20a852667da0e2e621e99916452489" exitCode=0 Jan 29 04:24:34 crc kubenswrapper[4707]: I0129 04:24:34.204436 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"3abdad2d3cfc260430421335c82eca649c20a852667da0e2e621e99916452489"} Jan 29 04:24:34 crc kubenswrapper[4707]: I0129 04:24:34.205305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"} Jan 29 04:24:34 crc kubenswrapper[4707]: I0129 04:24:34.205341 4707 scope.go:117] "RemoveContainer" containerID="5f0ce0edcfdf8d89942a5e49cc9e529d90d43201aa5cd0ffa320a1c402ce9ee7" Jan 29 04:24:40 crc kubenswrapper[4707]: I0129 04:24:40.995334 4707 scope.go:117] "RemoveContainer" containerID="51fe567c741545a7aa2afebc71201117bf1762413d32dc0db760f8c31f44fc27" Jan 29 04:24:41 crc kubenswrapper[4707]: I0129 04:24:41.037488 4707 scope.go:117] "RemoveContainer" containerID="29f41c079460f385d2185e2cb0bdd2d6fd36798fa80877f00a0ba9b35b625cf6" Jan 29 04:25:13 crc kubenswrapper[4707]: I0129 04:25:13.100508 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7886d5cc69-w8rzq_0a32b73c-f66f-425f-81a9-ef1cc36041d4/manager/0.log" Jan 29 04:25:16 crc kubenswrapper[4707]: I0129 04:25:16.870134 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:25:16 crc kubenswrapper[4707]: I0129 04:25:16.872464 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" 
podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="prometheus" containerID="cri-o://2af0aad06c6108a37a3daefc3c0e45e21c0511b3eeab974b4189d3168d954400" gracePeriod=600 Jan 29 04:25:16 crc kubenswrapper[4707]: I0129 04:25:16.872579 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="thanos-sidecar" containerID="cri-o://63a83181087476bd3335e0d653edbccdeb4e07fbe1039741e14489b44c0f875e" gracePeriod=600 Jan 29 04:25:16 crc kubenswrapper[4707]: I0129 04:25:16.872627 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="config-reloader" containerID="cri-o://986bc585e35d6dc652e08b265d8f2e26005bd358b1fe3bd7d75a1fcc4b931124" gracePeriod=600 Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.788800 4707 generic.go:334] "Generic (PLEG): container finished" podID="59879132-88c3-47df-97e0-17d51326d313" containerID="63a83181087476bd3335e0d653edbccdeb4e07fbe1039741e14489b44c0f875e" exitCode=0 Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.789873 4707 generic.go:334] "Generic (PLEG): container finished" podID="59879132-88c3-47df-97e0-17d51326d313" containerID="986bc585e35d6dc652e08b265d8f2e26005bd358b1fe3bd7d75a1fcc4b931124" exitCode=0 Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.789885 4707 generic.go:334] "Generic (PLEG): container finished" podID="59879132-88c3-47df-97e0-17d51326d313" containerID="2af0aad06c6108a37a3daefc3c0e45e21c0511b3eeab974b4189d3168d954400" exitCode=0 Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.789910 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerDied","Data":"63a83181087476bd3335e0d653edbccdeb4e07fbe1039741e14489b44c0f875e"} Jan 29 04:25:17 crc 
kubenswrapper[4707]: I0129 04:25:17.789953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerDied","Data":"986bc585e35d6dc652e08b265d8f2e26005bd358b1fe3bd7d75a1fcc4b931124"} Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.789965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerDied","Data":"2af0aad06c6108a37a3daefc3c0e45e21c0511b3eeab974b4189d3168d954400"} Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.929451 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945065 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-tls-assets\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945233 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5njg\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-kube-api-access-h5njg\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945284 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: 
I0129 04:25:17.945331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-secret-combined-ca-bundle\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945367 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-0\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945479 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-1\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-thanos-prometheus-http-client-file\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc 
kubenswrapper[4707]: I0129 04:25:17.945561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-db\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-config\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945625 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-2\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.945656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-config-out\") pod \"59879132-88c3-47df-97e0-17d51326d313\" (UID: \"59879132-88c3-47df-97e0-17d51326d313\") " Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.946699 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.949921 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.950851 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-db" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "prometheus-metric-storage-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.950950 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.954737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.954746 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.956340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-config" (OuterVolumeSpecName: "config") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.956437 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-config-out" (OuterVolumeSpecName: "config-out") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.973042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.973073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.973189 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-kube-api-access-h5njg" (OuterVolumeSpecName: "kube-api-access-h5njg") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "kube-api-access-h5njg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:25:17 crc kubenswrapper[4707]: I0129 04:25:17.975726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). 
InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050315 4707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-config-out\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050352 4707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050363 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5njg\" (UniqueName: \"kubernetes.io/projected/59879132-88c3-47df-97e0-17d51326d313-kube-api-access-h5njg\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050378 4707 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050388 4707 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050401 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050412 4707 reconciler_common.go:293] 
"Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050426 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050440 4707 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050452 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-db\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050462 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-config\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.050473 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/59879132-88c3-47df-97e0-17d51326d313-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.103471 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config" (OuterVolumeSpecName: "web-config") pod 
"59879132-88c3-47df-97e0-17d51326d313" (UID: "59879132-88c3-47df-97e0-17d51326d313"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.152549 4707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59879132-88c3-47df-97e0-17d51326d313-web-config\") on node \"crc\" DevicePath \"\"" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.824352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"59879132-88c3-47df-97e0-17d51326d313","Type":"ContainerDied","Data":"4bef02129057aa3088ef18c250de3955e550d8c0f989116026d2e9062ef2c348"} Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.824455 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.824787 4707 scope.go:117] "RemoveContainer" containerID="63a83181087476bd3335e0d653edbccdeb4e07fbe1039741e14489b44c0f875e" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.867035 4707 scope.go:117] "RemoveContainer" containerID="986bc585e35d6dc652e08b265d8f2e26005bd358b1fe3bd7d75a1fcc4b931124" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.872558 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.886950 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.892870 4707 scope.go:117] "RemoveContainer" containerID="2af0aad06c6108a37a3daefc3c0e45e21c0511b3eeab974b4189d3168d954400" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.904001 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:25:18 crc kubenswrapper[4707]: E0129 
04:25:18.904581 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="thanos-sidecar" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.904600 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="thanos-sidecar" Jan 29 04:25:18 crc kubenswrapper[4707]: E0129 04:25:18.904626 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="init-config-reloader" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.904632 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="init-config-reloader" Jan 29 04:25:18 crc kubenswrapper[4707]: E0129 04:25:18.904656 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="prometheus" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.904662 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="prometheus" Jan 29 04:25:18 crc kubenswrapper[4707]: E0129 04:25:18.904673 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="config-reloader" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.904681 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="config-reloader" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.904871 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="prometheus" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.904899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="thanos-sidecar" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 
04:25:18.904913 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="59879132-88c3-47df-97e0-17d51326d313" containerName="config-reloader" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.906638 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.910004 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.910039 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.910832 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.911711 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.912063 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.912188 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.912688 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.912873 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-k8x5b" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.921070 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" 
Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.931736 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.945760 4707 scope.go:117] "RemoveContainer" containerID="226ace85b1500ff22422028c24a0dc45a7643f5047bafb12b5e02f6528f117b4" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.967946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968050 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83c2f62d-5b16-40f5-bc31-1da853f155b9-config-out\") pod \"prometheus-metric-storage-0\" (UID: 
\"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-config\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968127 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968158 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83c2f62d-5b16-40f5-bc31-1da853f155b9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:18 crc kubenswrapper[4707]: I0129 04:25:18.968312 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gskjs\" (UniqueName: \"kubernetes.io/projected/83c2f62d-5b16-40f5-bc31-1da853f155b9-kube-api-access-gskjs\") pod \"prometheus-metric-storage-0\" (UID: 
\"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.070294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.070371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83c2f62d-5b16-40f5-bc31-1da853f155b9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.070397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-config\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" 
(UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071377 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83c2f62d-5b16-40f5-bc31-1da853f155b9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071436 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071484 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gskjs\" (UniqueName: \"kubernetes.io/projected/83c2f62d-5b16-40f5-bc31-1da853f155b9-kube-api-access-gskjs\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.071690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.072302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 
04:25:19.070811 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.075581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.075734 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83c2f62d-5b16-40f5-bc31-1da853f155b9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.076128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-config\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.076890 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc 
kubenswrapper[4707]: I0129 04:25:19.078751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.080235 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83c2f62d-5b16-40f5-bc31-1da853f155b9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.080349 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.080532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.083374 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83c2f62d-5b16-40f5-bc31-1da853f155b9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.087237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83c2f62d-5b16-40f5-bc31-1da853f155b9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.100476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gskjs\" (UniqueName: \"kubernetes.io/projected/83c2f62d-5b16-40f5-bc31-1da853f155b9-kube-api-access-gskjs\") pod \"prometheus-metric-storage-0\" (UID: \"83c2f62d-5b16-40f5-bc31-1da853f155b9\") " pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.235907 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:19 crc kubenswrapper[4707]: I0129 04:25:19.262999 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59879132-88c3-47df-97e0-17d51326d313" path="/var/lib/kubelet/pods/59879132-88c3-47df-97e0-17d51326d313/volumes" Jan 29 04:25:20 crc kubenswrapper[4707]: I0129 04:25:20.210742 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 29 04:25:20 crc kubenswrapper[4707]: I0129 04:25:20.856797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83c2f62d-5b16-40f5-bc31-1da853f155b9","Type":"ContainerStarted","Data":"d7247d792920ea422497c1d669e3eb2d2c65e5451284068ae605564df071abfc"} Jan 29 04:25:24 crc kubenswrapper[4707]: I0129 04:25:24.900724 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"83c2f62d-5b16-40f5-bc31-1da853f155b9","Type":"ContainerStarted","Data":"8d9a12c7a70392e370b74f637882de3807f5df4913e189aff407cdebb2acb509"} Jan 29 04:25:32 crc kubenswrapper[4707]: I0129 04:25:32.989561 4707 generic.go:334] "Generic (PLEG): container finished" podID="83c2f62d-5b16-40f5-bc31-1da853f155b9" containerID="8d9a12c7a70392e370b74f637882de3807f5df4913e189aff407cdebb2acb509" exitCode=0 Jan 29 04:25:32 crc kubenswrapper[4707]: I0129 04:25:32.989694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83c2f62d-5b16-40f5-bc31-1da853f155b9","Type":"ContainerDied","Data":"8d9a12c7a70392e370b74f637882de3807f5df4913e189aff407cdebb2acb509"} Jan 29 04:25:34 crc kubenswrapper[4707]: I0129 04:25:34.003801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83c2f62d-5b16-40f5-bc31-1da853f155b9","Type":"ContainerStarted","Data":"b4aa1f1c4a8f9df253b37632ddbb6df53bee7470bc5f6fc3fca25ad71840c7d4"} Jan 29 04:25:38 crc kubenswrapper[4707]: I0129 04:25:38.054815 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83c2f62d-5b16-40f5-bc31-1da853f155b9","Type":"ContainerStarted","Data":"f1eaa1a44f0778bab599ba7578efd724d4edf2a14e71b995615424624f4cd92c"} Jan 29 04:25:38 crc kubenswrapper[4707]: I0129 04:25:38.055642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83c2f62d-5b16-40f5-bc31-1da853f155b9","Type":"ContainerStarted","Data":"f3ce0eaabe31d8d67322b1b7e009a71646fe2aff1e79c23c603ffcf96982483e"} Jan 29 04:25:38 crc kubenswrapper[4707]: I0129 04:25:38.094139 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.094101028 podStartE2EDuration="20.094101028s" podCreationTimestamp="2026-01-29 04:25:18 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:25:38.083182481 +0000 UTC m=+3491.567411386" watchObservedRunningTime="2026-01-29 04:25:38.094101028 +0000 UTC m=+3491.578329963" Jan 29 04:25:39 crc kubenswrapper[4707]: I0129 04:25:39.236314 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:41 crc kubenswrapper[4707]: I0129 04:25:41.118114 4707 scope.go:117] "RemoveContainer" containerID="49d2cc4780ffe0932c4e0f0e9379207a54689a710b13dbc0d38c7c7d9e94bd19" Jan 29 04:25:41 crc kubenswrapper[4707]: I0129 04:25:41.161282 4707 scope.go:117] "RemoveContainer" containerID="f8f19396d6f2d620ad43f834b927a1eb9fcf835523329a5fd6cce130abc3324c" Jan 29 04:25:41 crc kubenswrapper[4707]: I0129 04:25:41.193229 4707 scope.go:117] "RemoveContainer" containerID="d56d12cadeeec88b14ea4ab85207d0e37ef39f62b3596bada68052f5e441c1fe" Jan 29 04:25:49 crc kubenswrapper[4707]: I0129 04:25:49.239755 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:49 crc kubenswrapper[4707]: I0129 04:25:49.291144 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 29 04:25:50 crc kubenswrapper[4707]: I0129 04:25:50.246439 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 29 04:26:33 crc kubenswrapper[4707]: I0129 04:26:33.463528 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:26:33 crc kubenswrapper[4707]: I0129 04:26:33.464267 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:27:03 crc kubenswrapper[4707]: I0129 04:27:03.462912 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:27:03 crc kubenswrapper[4707]: I0129 04:27:03.463497 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:27:16 crc kubenswrapper[4707]: I0129 04:27:16.309460 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7886d5cc69-w8rzq_0a32b73c-f66f-425f-81a9-ef1cc36041d4/manager/0.log" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.462805 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.463490 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.463555 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.464327 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.464374 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" gracePeriod=600 Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.551779 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4k5/must-gather-786t8"] Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.554107 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.556824 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4k4k5"/"openshift-service-ca.crt" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.556899 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4k4k5"/"default-dockercfg-tzw99" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.571367 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4k4k5"/"kube-root-ca.crt" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.587554 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4k4k5/must-gather-786t8"] Jan 29 04:27:33 crc kubenswrapper[4707]: E0129 04:27:33.606695 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.642836 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" exitCode=0 Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.642881 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"} Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.642919 4707 scope.go:117] "RemoveContainer" 
containerID="3abdad2d3cfc260430421335c82eca649c20a852667da0e2e621e99916452489" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.645646 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:27:33 crc kubenswrapper[4707]: E0129 04:27:33.646103 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.696625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfd22e26-c073-47dd-b0b4-4d58d6f93522-must-gather-output\") pod \"must-gather-786t8\" (UID: \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\") " pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.696791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptr5\" (UniqueName: \"kubernetes.io/projected/dfd22e26-c073-47dd-b0b4-4d58d6f93522-kube-api-access-7ptr5\") pod \"must-gather-786t8\" (UID: \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\") " pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.799234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptr5\" (UniqueName: \"kubernetes.io/projected/dfd22e26-c073-47dd-b0b4-4d58d6f93522-kube-api-access-7ptr5\") pod \"must-gather-786t8\" (UID: \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\") " pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:27:33 crc 
kubenswrapper[4707]: I0129 04:27:33.799371 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfd22e26-c073-47dd-b0b4-4d58d6f93522-must-gather-output\") pod \"must-gather-786t8\" (UID: \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\") " pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.799834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfd22e26-c073-47dd-b0b4-4d58d6f93522-must-gather-output\") pod \"must-gather-786t8\" (UID: \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\") " pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.826230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptr5\" (UniqueName: \"kubernetes.io/projected/dfd22e26-c073-47dd-b0b4-4d58d6f93522-kube-api-access-7ptr5\") pod \"must-gather-786t8\" (UID: \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\") " pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:27:33 crc kubenswrapper[4707]: I0129 04:27:33.875277 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:27:34 crc kubenswrapper[4707]: I0129 04:27:34.472228 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4k4k5/must-gather-786t8"] Jan 29 04:27:34 crc kubenswrapper[4707]: I0129 04:27:34.477801 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 04:27:34 crc kubenswrapper[4707]: I0129 04:27:34.655075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4k5/must-gather-786t8" event={"ID":"dfd22e26-c073-47dd-b0b4-4d58d6f93522","Type":"ContainerStarted","Data":"660f5e809f6c9f3396452f267f11c769c86b6e49efe35de20fc49e0f486851ff"} Jan 29 04:27:41 crc kubenswrapper[4707]: I0129 04:27:41.732721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4k5/must-gather-786t8" event={"ID":"dfd22e26-c073-47dd-b0b4-4d58d6f93522","Type":"ContainerStarted","Data":"f1394075470c15b524dac2a37cdebd724b5aebc63ce6136c819249b730e8e058"} Jan 29 04:27:42 crc kubenswrapper[4707]: I0129 04:27:42.743432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4k5/must-gather-786t8" event={"ID":"dfd22e26-c073-47dd-b0b4-4d58d6f93522","Type":"ContainerStarted","Data":"1bb7fe194150d2d673f3902769beef3264893fd96c3a6793d754d049e5cb1813"} Jan 29 04:27:42 crc kubenswrapper[4707]: I0129 04:27:42.764658 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4k4k5/must-gather-786t8" podStartSLOduration=2.784860887 podStartE2EDuration="9.764639661s" podCreationTimestamp="2026-01-29 04:27:33 +0000 UTC" firstStartedPulling="2026-01-29 04:27:34.477616969 +0000 UTC m=+3607.961845874" lastFinishedPulling="2026-01-29 04:27:41.457395743 +0000 UTC m=+3614.941624648" observedRunningTime="2026-01-29 04:27:42.764066225 +0000 UTC m=+3616.248295130" watchObservedRunningTime="2026-01-29 04:27:42.764639661 +0000 UTC 
m=+3616.248868566" Jan 29 04:27:45 crc kubenswrapper[4707]: I0129 04:27:45.244126 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:27:45 crc kubenswrapper[4707]: E0129 04:27:45.244866 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.356752 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4k5/crc-debug-cqgts"] Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.358766 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4k5/crc-debug-cqgts" Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.424906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vmdb\" (UniqueName: \"kubernetes.io/projected/53b148b8-b9e0-4b94-ab33-22f485485381-kube-api-access-8vmdb\") pod \"crc-debug-cqgts\" (UID: \"53b148b8-b9e0-4b94-ab33-22f485485381\") " pod="openshift-must-gather-4k4k5/crc-debug-cqgts" Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.424972 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53b148b8-b9e0-4b94-ab33-22f485485381-host\") pod \"crc-debug-cqgts\" (UID: \"53b148b8-b9e0-4b94-ab33-22f485485381\") " pod="openshift-must-gather-4k4k5/crc-debug-cqgts" Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.526964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8vmdb\" (UniqueName: \"kubernetes.io/projected/53b148b8-b9e0-4b94-ab33-22f485485381-kube-api-access-8vmdb\") pod \"crc-debug-cqgts\" (UID: \"53b148b8-b9e0-4b94-ab33-22f485485381\") " pod="openshift-must-gather-4k4k5/crc-debug-cqgts" Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.527022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53b148b8-b9e0-4b94-ab33-22f485485381-host\") pod \"crc-debug-cqgts\" (UID: \"53b148b8-b9e0-4b94-ab33-22f485485381\") " pod="openshift-must-gather-4k4k5/crc-debug-cqgts" Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.527193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53b148b8-b9e0-4b94-ab33-22f485485381-host\") pod \"crc-debug-cqgts\" (UID: \"53b148b8-b9e0-4b94-ab33-22f485485381\") " pod="openshift-must-gather-4k4k5/crc-debug-cqgts" Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.552713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vmdb\" (UniqueName: \"kubernetes.io/projected/53b148b8-b9e0-4b94-ab33-22f485485381-kube-api-access-8vmdb\") pod \"crc-debug-cqgts\" (UID: \"53b148b8-b9e0-4b94-ab33-22f485485381\") " pod="openshift-must-gather-4k4k5/crc-debug-cqgts" Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.687362 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4k5/crc-debug-cqgts" Jan 29 04:27:47 crc kubenswrapper[4707]: I0129 04:27:47.804234 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4k5/crc-debug-cqgts" event={"ID":"53b148b8-b9e0-4b94-ab33-22f485485381","Type":"ContainerStarted","Data":"93725b9a269bd0789407b012bc4b8eb50e7837d9059ad2599b434dc4d0fa6e57"} Jan 29 04:27:57 crc kubenswrapper[4707]: I0129 04:27:57.255046 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:27:57 crc kubenswrapper[4707]: E0129 04:27:57.256010 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:28:00 crc kubenswrapper[4707]: I0129 04:28:00.951956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4k5/crc-debug-cqgts" event={"ID":"53b148b8-b9e0-4b94-ab33-22f485485381","Type":"ContainerStarted","Data":"d7a338f1c2ec18a7052b453e14851da48c1f9e3f59c0e7551a22d3b0ac43343d"} Jan 29 04:28:00 crc kubenswrapper[4707]: I0129 04:28:00.971518 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4k4k5/crc-debug-cqgts" podStartSLOduration=1.798278248 podStartE2EDuration="13.971498566s" podCreationTimestamp="2026-01-29 04:27:47 +0000 UTC" firstStartedPulling="2026-01-29 04:27:47.760828236 +0000 UTC m=+3621.245057141" lastFinishedPulling="2026-01-29 04:27:59.934048554 +0000 UTC m=+3633.418277459" observedRunningTime="2026-01-29 04:28:00.970325453 +0000 UTC m=+3634.454554368" watchObservedRunningTime="2026-01-29 04:28:00.971498566 +0000 UTC 
m=+3634.455727481" Jan 29 04:28:09 crc kubenswrapper[4707]: I0129 04:28:09.246231 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:28:09 crc kubenswrapper[4707]: E0129 04:28:09.247412 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:28:21 crc kubenswrapper[4707]: I0129 04:28:21.129401 4707 generic.go:334] "Generic (PLEG): container finished" podID="53b148b8-b9e0-4b94-ab33-22f485485381" containerID="d7a338f1c2ec18a7052b453e14851da48c1f9e3f59c0e7551a22d3b0ac43343d" exitCode=0 Jan 29 04:28:21 crc kubenswrapper[4707]: I0129 04:28:21.129482 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4k5/crc-debug-cqgts" event={"ID":"53b148b8-b9e0-4b94-ab33-22f485485381","Type":"ContainerDied","Data":"d7a338f1c2ec18a7052b453e14851da48c1f9e3f59c0e7551a22d3b0ac43343d"} Jan 29 04:28:22 crc kubenswrapper[4707]: I0129 04:28:22.264293 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4k5/crc-debug-cqgts" Jan 29 04:28:22 crc kubenswrapper[4707]: I0129 04:28:22.326586 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4k5/crc-debug-cqgts"] Jan 29 04:28:22 crc kubenswrapper[4707]: I0129 04:28:22.336688 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4k4k5/crc-debug-cqgts"] Jan 29 04:28:22 crc kubenswrapper[4707]: I0129 04:28:22.439179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53b148b8-b9e0-4b94-ab33-22f485485381-host\") pod \"53b148b8-b9e0-4b94-ab33-22f485485381\" (UID: \"53b148b8-b9e0-4b94-ab33-22f485485381\") " Jan 29 04:28:22 crc kubenswrapper[4707]: I0129 04:28:22.439285 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53b148b8-b9e0-4b94-ab33-22f485485381-host" (OuterVolumeSpecName: "host") pod "53b148b8-b9e0-4b94-ab33-22f485485381" (UID: "53b148b8-b9e0-4b94-ab33-22f485485381"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 04:28:22 crc kubenswrapper[4707]: I0129 04:28:22.439350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vmdb\" (UniqueName: \"kubernetes.io/projected/53b148b8-b9e0-4b94-ab33-22f485485381-kube-api-access-8vmdb\") pod \"53b148b8-b9e0-4b94-ab33-22f485485381\" (UID: \"53b148b8-b9e0-4b94-ab33-22f485485381\") " Jan 29 04:28:22 crc kubenswrapper[4707]: I0129 04:28:22.439904 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53b148b8-b9e0-4b94-ab33-22f485485381-host\") on node \"crc\" DevicePath \"\"" Jan 29 04:28:22 crc kubenswrapper[4707]: I0129 04:28:22.445411 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b148b8-b9e0-4b94-ab33-22f485485381-kube-api-access-8vmdb" (OuterVolumeSpecName: "kube-api-access-8vmdb") pod "53b148b8-b9e0-4b94-ab33-22f485485381" (UID: "53b148b8-b9e0-4b94-ab33-22f485485381"). InnerVolumeSpecName "kube-api-access-8vmdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:28:22 crc kubenswrapper[4707]: I0129 04:28:22.542414 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vmdb\" (UniqueName: \"kubernetes.io/projected/53b148b8-b9e0-4b94-ab33-22f485485381-kube-api-access-8vmdb\") on node \"crc\" DevicePath \"\"" Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.151959 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93725b9a269bd0789407b012bc4b8eb50e7837d9059ad2599b434dc4d0fa6e57" Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.152022 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4k5/crc-debug-cqgts"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.243824 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:28:23 crc kubenswrapper[4707]: E0129 04:28:23.244282 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.254171 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53b148b8-b9e0-4b94-ab33-22f485485381" path="/var/lib/kubelet/pods/53b148b8-b9e0-4b94-ab33-22f485485381/volumes"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.546853 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4k5/crc-debug-kn26g"]
Jan 29 04:28:23 crc kubenswrapper[4707]: E0129 04:28:23.547287 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b148b8-b9e0-4b94-ab33-22f485485381" containerName="container-00"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.547300 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b148b8-b9e0-4b94-ab33-22f485485381" containerName="container-00"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.547505 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b148b8-b9e0-4b94-ab33-22f485485381" containerName="container-00"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.548221 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.568523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bf91dd7-80ac-4b87-b2b6-d853054a57df-host\") pod \"crc-debug-kn26g\" (UID: \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\") " pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.568705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hg7j\" (UniqueName: \"kubernetes.io/projected/3bf91dd7-80ac-4b87-b2b6-d853054a57df-kube-api-access-5hg7j\") pod \"crc-debug-kn26g\" (UID: \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\") " pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.670528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hg7j\" (UniqueName: \"kubernetes.io/projected/3bf91dd7-80ac-4b87-b2b6-d853054a57df-kube-api-access-5hg7j\") pod \"crc-debug-kn26g\" (UID: \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\") " pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.671174 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bf91dd7-80ac-4b87-b2b6-d853054a57df-host\") pod \"crc-debug-kn26g\" (UID: \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\") " pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.671307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bf91dd7-80ac-4b87-b2b6-d853054a57df-host\") pod \"crc-debug-kn26g\" (UID: \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\") " pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.689984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hg7j\" (UniqueName: \"kubernetes.io/projected/3bf91dd7-80ac-4b87-b2b6-d853054a57df-kube-api-access-5hg7j\") pod \"crc-debug-kn26g\" (UID: \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\") " pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:23 crc kubenswrapper[4707]: I0129 04:28:23.873059 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:24 crc kubenswrapper[4707]: I0129 04:28:24.163824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4k5/crc-debug-kn26g" event={"ID":"3bf91dd7-80ac-4b87-b2b6-d853054a57df","Type":"ContainerStarted","Data":"91b69b8532e559fd69fb1c8229eeb1cef428e3b9f5e377cf60a2abcf930a21e3"}
Jan 29 04:28:25 crc kubenswrapper[4707]: I0129 04:28:25.177396 4707 generic.go:334] "Generic (PLEG): container finished" podID="3bf91dd7-80ac-4b87-b2b6-d853054a57df" containerID="8cf0295a37fd4f1befa905f6290641286c9c1527dc499c077bce0277a0a837ba" exitCode=1
Jan 29 04:28:25 crc kubenswrapper[4707]: I0129 04:28:25.177529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4k5/crc-debug-kn26g" event={"ID":"3bf91dd7-80ac-4b87-b2b6-d853054a57df","Type":"ContainerDied","Data":"8cf0295a37fd4f1befa905f6290641286c9c1527dc499c077bce0277a0a837ba"}
Jan 29 04:28:25 crc kubenswrapper[4707]: I0129 04:28:25.229061 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4k5/crc-debug-kn26g"]
Jan 29 04:28:25 crc kubenswrapper[4707]: I0129 04:28:25.255793 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4k4k5/crc-debug-kn26g"]
Jan 29 04:28:26 crc kubenswrapper[4707]: I0129 04:28:26.295424 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:26 crc kubenswrapper[4707]: I0129 04:28:26.436890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bf91dd7-80ac-4b87-b2b6-d853054a57df-host\") pod \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\" (UID: \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\") "
Jan 29 04:28:26 crc kubenswrapper[4707]: I0129 04:28:26.437037 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3bf91dd7-80ac-4b87-b2b6-d853054a57df-host" (OuterVolumeSpecName: "host") pod "3bf91dd7-80ac-4b87-b2b6-d853054a57df" (UID: "3bf91dd7-80ac-4b87-b2b6-d853054a57df"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 04:28:26 crc kubenswrapper[4707]: I0129 04:28:26.437321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hg7j\" (UniqueName: \"kubernetes.io/projected/3bf91dd7-80ac-4b87-b2b6-d853054a57df-kube-api-access-5hg7j\") pod \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\" (UID: \"3bf91dd7-80ac-4b87-b2b6-d853054a57df\") "
Jan 29 04:28:26 crc kubenswrapper[4707]: I0129 04:28:26.439304 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bf91dd7-80ac-4b87-b2b6-d853054a57df-host\") on node \"crc\" DevicePath \"\""
Jan 29 04:28:26 crc kubenswrapper[4707]: I0129 04:28:26.446477 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf91dd7-80ac-4b87-b2b6-d853054a57df-kube-api-access-5hg7j" (OuterVolumeSpecName: "kube-api-access-5hg7j") pod "3bf91dd7-80ac-4b87-b2b6-d853054a57df" (UID: "3bf91dd7-80ac-4b87-b2b6-d853054a57df"). InnerVolumeSpecName "kube-api-access-5hg7j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 04:28:26 crc kubenswrapper[4707]: I0129 04:28:26.540955 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hg7j\" (UniqueName: \"kubernetes.io/projected/3bf91dd7-80ac-4b87-b2b6-d853054a57df-kube-api-access-5hg7j\") on node \"crc\" DevicePath \"\""
Jan 29 04:28:27 crc kubenswrapper[4707]: I0129 04:28:27.197327 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91b69b8532e559fd69fb1c8229eeb1cef428e3b9f5e377cf60a2abcf930a21e3"
Jan 29 04:28:27 crc kubenswrapper[4707]: I0129 04:28:27.197402 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4k5/crc-debug-kn26g"
Jan 29 04:28:27 crc kubenswrapper[4707]: I0129 04:28:27.257293 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf91dd7-80ac-4b87-b2b6-d853054a57df" path="/var/lib/kubelet/pods/3bf91dd7-80ac-4b87-b2b6-d853054a57df/volumes"
Jan 29 04:28:36 crc kubenswrapper[4707]: I0129 04:28:36.244002 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:28:36 crc kubenswrapper[4707]: E0129 04:28:36.245049 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:28:51 crc kubenswrapper[4707]: I0129 04:28:51.245439 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:28:51 crc kubenswrapper[4707]: E0129 04:28:51.246497 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:29:04 crc kubenswrapper[4707]: I0129 04:29:04.243699 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:29:04 crc kubenswrapper[4707]: E0129 04:29:04.244917 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:29:17 crc kubenswrapper[4707]: I0129 04:29:17.252624 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:29:17 crc kubenswrapper[4707]: E0129 04:29:17.254013 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:29:22 crc kubenswrapper[4707]: I0129 04:29:22.835064 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_21d0ba1c-ab07-48da-8e34-93da9d1c9c6a/init-config-reloader/0.log"
Jan 29 04:29:22 crc kubenswrapper[4707]: I0129 04:29:22.977836 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_21d0ba1c-ab07-48da-8e34-93da9d1c9c6a/init-config-reloader/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.043020 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_21d0ba1c-ab07-48da-8e34-93da9d1c9c6a/alertmanager/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.049467 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_21d0ba1c-ab07-48da-8e34-93da9d1c9c6a/config-reloader/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.235961 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_1c46106d-ca5d-4cac-820f-cf2935abf6d8/aodh-api/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.276187 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_1c46106d-ca5d-4cac-820f-cf2935abf6d8/aodh-listener/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.289806 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_1c46106d-ca5d-4cac-820f-cf2935abf6d8/aodh-evaluator/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.406852 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_1c46106d-ca5d-4cac-820f-cf2935abf6d8/aodh-notifier/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.499806 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6cf5fb45fd-lqs99_27a75d55-3866-4c55-bffb-8f1f1d53b687/barbican-api/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.517671 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6cf5fb45fd-lqs99_27a75d55-3866-4c55-bffb-8f1f1d53b687/barbican-api-log/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.683088 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7665bc55c6-vnwk8_374bee16-aeed-4b53-845a-494375d065f6/barbican-keystone-listener/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.745428 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7665bc55c6-vnwk8_374bee16-aeed-4b53-845a-494375d065f6/barbican-keystone-listener-log/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.830570 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-78bfcf785f-txfhz_cd6d292d-51ed-4b96-89e1-06220cd5f98b/barbican-worker/0.log"
Jan 29 04:29:23 crc kubenswrapper[4707]: I0129 04:29:23.903746 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-78bfcf785f-txfhz_cd6d292d-51ed-4b96-89e1-06220cd5f98b/barbican-worker-log/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.126001 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7c20a992-d535-46ad-9cc4-f2348c18f7ca/ceilometer-central-agent/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.135080 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz_4ed3ca47-cf57-4534-b12c-2aa6c2be26cd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.248401 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7c20a992-d535-46ad-9cc4-f2348c18f7ca/ceilometer-notification-agent/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.359265 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7c20a992-d535-46ad-9cc4-f2348c18f7ca/proxy-httpd/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.366016 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7c20a992-d535-46ad-9cc4-f2348c18f7ca/sg-core/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.565909 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b270d5b9-7b1a-44b2-b915-4f63e06a10eb/cinder-api/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.613477 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b270d5b9-7b1a-44b2-b915-4f63e06a10eb/cinder-api-log/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.817006 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_539d5b33-91ee-4790-941f-22c82388ed87/cinder-scheduler/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.860360 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_539d5b33-91ee-4790-941f-22c82388ed87/probe/0.log"
Jan 29 04:29:24 crc kubenswrapper[4707]: I0129 04:29:24.886028 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7_98ca67a4-de19-4954-a988-1c743df160cd/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:25 crc kubenswrapper[4707]: I0129 04:29:25.113881 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2_933d5dc9-d255-45c9-837d-251701e8fd77/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:25 crc kubenswrapper[4707]: I0129 04:29:25.149802 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-4wd7q_fdf17e92-84cc-4d06-ba4f-714cfd41c134/init/0.log"
Jan 29 04:29:25 crc kubenswrapper[4707]: I0129 04:29:25.429111 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-4wd7q_fdf17e92-84cc-4d06-ba4f-714cfd41c134/init/0.log"
Jan 29 04:29:25 crc kubenswrapper[4707]: I0129 04:29:25.509366 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-4wd7q_fdf17e92-84cc-4d06-ba4f-714cfd41c134/dnsmasq-dns/0.log"
Jan 29 04:29:25 crc kubenswrapper[4707]: I0129 04:29:25.541105 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt_43b1dffd-18d4-4201-9a3f-5ef4db33c8b7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:25 crc kubenswrapper[4707]: I0129 04:29:25.717316 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_46ce0794-979b-4f4c-9a41-b895bbc25d0c/glance-log/0.log"
Jan 29 04:29:25 crc kubenswrapper[4707]: I0129 04:29:25.731917 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_46ce0794-979b-4f4c-9a41-b895bbc25d0c/glance-httpd/0.log"
Jan 29 04:29:25 crc kubenswrapper[4707]: I0129 04:29:25.889045 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9a1690ce-5c45-4a23-abd5-a1521acd3f82/glance-httpd/0.log"
Jan 29 04:29:25 crc kubenswrapper[4707]: I0129 04:29:25.949781 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9a1690ce-5c45-4a23-abd5-a1521acd3f82/glance-log/0.log"
Jan 29 04:29:26 crc kubenswrapper[4707]: I0129 04:29:26.404728 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-74d4777d5f-4mj7v_9b19f31f-481f-4feb-91bb-09df20de5654/heat-api/0.log"
Jan 29 04:29:26 crc kubenswrapper[4707]: I0129 04:29:26.536027 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-7b6695978c-nxvfw_52ddee41-4c3c-4b7b-b637-2de751496d37/heat-engine/0.log"
Jan 29 04:29:26 crc kubenswrapper[4707]: I0129 04:29:26.573481 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-78c45ff765-hw8sk_0f27421c-79af-4e0d-b97f-c1d73b2524e2/heat-cfnapi/0.log"
Jan 29 04:29:26 crc kubenswrapper[4707]: I0129 04:29:26.662898 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p_945bd58d-5ea2-4118-a675-3b7b127d9d4c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:26 crc kubenswrapper[4707]: I0129 04:29:26.907180 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-b4b42_1503434a-951a-4e31-836e-c1f37b794d45/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:27 crc kubenswrapper[4707]: I0129 04:29:27.095149 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6568fdcd45-j5nxz_0542ad30-5c42-4464-83b9-3faebd15a9ea/keystone-api/0.log"
Jan 29 04:29:27 crc kubenswrapper[4707]: I0129 04:29:27.115517 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29494321-prr9g_b1afb9c0-b9e9-46d1-b608-36148c671d74/keystone-cron/0.log"
Jan 29 04:29:27 crc kubenswrapper[4707]: I0129 04:29:27.187846 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6f586474-1963-4702-81bf-36d31bf0a3ae/kube-state-metrics/0.log"
Jan 29 04:29:27 crc kubenswrapper[4707]: I0129 04:29:27.330184 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz_a019e4eb-4ee9-4426-bde0-9c6b0319283f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:27 crc kubenswrapper[4707]: I0129 04:29:27.606408 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f8ffb664f-gwtlc_4d553753-4701-4a28-81dd-f7d0fbe719d6/neutron-api/0.log"
Jan 29 04:29:27 crc kubenswrapper[4707]: I0129 04:29:27.686140 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f8ffb664f-gwtlc_4d553753-4701-4a28-81dd-f7d0fbe719d6/neutron-httpd/0.log"
Jan 29 04:29:27 crc kubenswrapper[4707]: I0129 04:29:27.786913 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n_9f02add7-c3ef-4952-b83f-1799bf08bad0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:28 crc kubenswrapper[4707]: I0129 04:29:28.124467 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ce279671-2df3-4af7-a6bb-2ac9fdc048da/nova-api-log/0.log"
Jan 29 04:29:28 crc kubenswrapper[4707]: I0129 04:29:28.260125 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_66aa645d-9edf-4791-a8f7-2607ad442104/nova-cell0-conductor-conductor/0.log"
Jan 29 04:29:28 crc kubenswrapper[4707]: I0129 04:29:28.412041 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ce279671-2df3-4af7-a6bb-2ac9fdc048da/nova-api-api/0.log"
Jan 29 04:29:28 crc kubenswrapper[4707]: I0129 04:29:28.516225 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_47acef2f-13c3-47cd-b61f-a65e20f570a4/nova-cell1-conductor-conductor/0.log"
Jan 29 04:29:28 crc kubenswrapper[4707]: I0129 04:29:28.770256 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_99c8a2aa-31c2-4927-af04-8f5e8c50198e/nova-cell1-novncproxy-novncproxy/0.log"
Jan 29 04:29:28 crc kubenswrapper[4707]: I0129 04:29:28.867228 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-2b5nk_018b06ef-5822-4b5e-ae32-43bf56e40f19/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:29 crc kubenswrapper[4707]: I0129 04:29:29.244738 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:29:29 crc kubenswrapper[4707]: E0129 04:29:29.245528 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:29:29 crc kubenswrapper[4707]: I0129 04:29:29.392138 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df23cf25-bfda-4999-85bf-ef4af0738ece/nova-metadata-log/0.log"
Jan 29 04:29:29 crc kubenswrapper[4707]: I0129 04:29:29.520922 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_7164be40-3659-450f-885f-db200baa5ed2/nova-scheduler-scheduler/0.log"
Jan 29 04:29:29 crc kubenswrapper[4707]: I0129 04:29:29.747044 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6aaab4a-5490-4d22-ac2a-e346a1371683/mysql-bootstrap/0.log"
Jan 29 04:29:29 crc kubenswrapper[4707]: I0129 04:29:29.955599 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6aaab4a-5490-4d22-ac2a-e346a1371683/galera/0.log"
Jan 29 04:29:29 crc kubenswrapper[4707]: I0129 04:29:29.962055 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6aaab4a-5490-4d22-ac2a-e346a1371683/mysql-bootstrap/0.log"
Jan 29 04:29:30 crc kubenswrapper[4707]: I0129 04:29:30.180892 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8dbb64e8-99fc-4b59-abdc-fce36a90b82f/mysql-bootstrap/0.log"
Jan 29 04:29:30 crc kubenswrapper[4707]: I0129 04:29:30.390916 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8dbb64e8-99fc-4b59-abdc-fce36a90b82f/mysql-bootstrap/0.log"
Jan 29 04:29:30 crc kubenswrapper[4707]: I0129 04:29:30.530088 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8dbb64e8-99fc-4b59-abdc-fce36a90b82f/galera/0.log"
Jan 29 04:29:30 crc kubenswrapper[4707]: I0129 04:29:30.646897 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2afcd46a-a1c0-41cf-866e-3a39e0ac9a36/openstackclient/0.log"
Jan 29 04:29:30 crc kubenswrapper[4707]: I0129 04:29:30.786166 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hpq5q_9f831116-140a-4c6b-8d7c-aad99fcaf97c/ovn-controller/0.log"
Jan 29 04:29:30 crc kubenswrapper[4707]: I0129 04:29:30.870958 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df23cf25-bfda-4999-85bf-ef4af0738ece/nova-metadata-metadata/0.log"
Jan 29 04:29:31 crc kubenswrapper[4707]: I0129 04:29:31.022581 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6rw89_3bc182e0-848e-43b3-8d1e-920440755bca/openstack-network-exporter/0.log"
Jan 29 04:29:31 crc kubenswrapper[4707]: I0129 04:29:31.185354 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hxz2d_f25cc401-b568-4936-9947-2a54b5f6dea9/ovsdb-server-init/0.log"
Jan 29 04:29:31 crc kubenswrapper[4707]: I0129 04:29:31.355141 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hxz2d_f25cc401-b568-4936-9947-2a54b5f6dea9/ovsdb-server-init/0.log"
Jan 29 04:29:31 crc kubenswrapper[4707]: I0129 04:29:31.393228 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hxz2d_f25cc401-b568-4936-9947-2a54b5f6dea9/ovs-vswitchd/0.log"
Jan 29 04:29:31 crc kubenswrapper[4707]: I0129 04:29:31.396923 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hxz2d_f25cc401-b568-4936-9947-2a54b5f6dea9/ovsdb-server/0.log"
Jan 29 04:29:31 crc kubenswrapper[4707]: I0129 04:29:31.669628 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_79b44fdd-6478-42a0-9817-b3d949683532/openstack-network-exporter/0.log"
Jan 29 04:29:31 crc kubenswrapper[4707]: I0129 04:29:31.735848 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bzfvj_80667caf-0ec4-4178-96b2-93b148db9c1e/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:31 crc kubenswrapper[4707]: I0129 04:29:31.797183 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_79b44fdd-6478-42a0-9817-b3d949683532/ovn-northd/0.log"
Jan 29 04:29:31 crc kubenswrapper[4707]: I0129 04:29:31.993323 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_445c0ce8-31bb-4f8a-a139-e1d7a63d38f7/openstack-network-exporter/0.log"
Jan 29 04:29:32 crc kubenswrapper[4707]: I0129 04:29:32.050096 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_445c0ce8-31bb-4f8a-a139-e1d7a63d38f7/ovsdbserver-nb/0.log"
Jan 29 04:29:32 crc kubenswrapper[4707]: I0129 04:29:32.290679 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b5dee206-46c6-44c4-885d-0d8ba9149bfd/ovsdbserver-sb/0.log"
Jan 29 04:29:32 crc kubenswrapper[4707]: I0129 04:29:32.381401 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b5dee206-46c6-44c4-885d-0d8ba9149bfd/openstack-network-exporter/0.log"
Jan 29 04:29:32 crc kubenswrapper[4707]: I0129 04:29:32.420802 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9f459ff7d-tkv2s_2ceecbc6-bf80-4008-80e3-0a43426cf4c6/placement-api/0.log"
Jan 29 04:29:32 crc kubenswrapper[4707]: I0129 04:29:32.650510 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9f459ff7d-tkv2s_2ceecbc6-bf80-4008-80e3-0a43426cf4c6/placement-log/0.log"
Jan 29 04:29:32 crc kubenswrapper[4707]: I0129 04:29:32.746617 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/init-config-reloader/0.log"
Jan 29 04:29:32 crc kubenswrapper[4707]: I0129 04:29:32.922618 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/init-config-reloader/0.log"
Jan 29 04:29:32 crc kubenswrapper[4707]: I0129 04:29:32.946632 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/config-reloader/0.log"
Jan 29 04:29:33 crc kubenswrapper[4707]: I0129 04:29:33.003801 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/prometheus/0.log"
Jan 29 04:29:33 crc kubenswrapper[4707]: I0129 04:29:33.051456 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/thanos-sidecar/0.log"
Jan 29 04:29:33 crc kubenswrapper[4707]: I0129 04:29:33.219529 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_688208b9-5567-4a47-9ec9-76ce03ec8991/setup-container/0.log"
Jan 29 04:29:33 crc kubenswrapper[4707]: I0129 04:29:33.516142 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_688208b9-5567-4a47-9ec9-76ce03ec8991/rabbitmq/0.log"
Jan 29 04:29:33 crc kubenswrapper[4707]: I0129 04:29:33.518161 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_688208b9-5567-4a47-9ec9-76ce03ec8991/setup-container/0.log"
Jan 29 04:29:33 crc kubenswrapper[4707]: I0129 04:29:33.563247 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fb587ed3-9015-4748-a28b-10d4132ffdfb/setup-container/0.log"
Jan 29 04:29:34 crc kubenswrapper[4707]: I0129 04:29:34.040359 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fb587ed3-9015-4748-a28b-10d4132ffdfb/setup-container/0.log"
Jan 29 04:29:34 crc kubenswrapper[4707]: I0129 04:29:34.065232 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fb587ed3-9015-4748-a28b-10d4132ffdfb/rabbitmq/0.log"
Jan 29 04:29:34 crc kubenswrapper[4707]: I0129 04:29:34.220270 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm_c932c837-2020-4db4-8598-c4803eff8029/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:34 crc kubenswrapper[4707]: I0129 04:29:34.332092 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-5nrs9_7b03e702-6a8f-4bcc-8be0-4ec0eaf53900/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:34 crc kubenswrapper[4707]: I0129 04:29:34.519484 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s_3b1ed1fd-6748-40d9-b458-ca63cb4479e0/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:34 crc kubenswrapper[4707]: I0129 04:29:34.609028 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-ncft6_21dfe5ce-4935-46c4-8124-cb4fecf0a906/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:34 crc kubenswrapper[4707]: I0129 04:29:34.821655 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-r7f6d_c8e32f79-d8b0-424c-a23e-fe94623016de/ssh-known-hosts-edpm-deployment/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.056359 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67b9cbc75f-dv5cr_57f35f5f-1517-41b4-b354-59fd90d8fea5/proxy-httpd/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.116152 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67b9cbc75f-dv5cr_57f35f5f-1517-41b4-b354-59fd90d8fea5/proxy-server/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.209756 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-bxs5q_b9e5582d-dd71-4ccd-84ea-bc133dce917c/swift-ring-rebalance/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.408949 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/account-reaper/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.415982 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/account-auditor/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.498993 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/account-replicator/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.671909 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/account-server/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.680694 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/container-auditor/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.703775 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/container-replicator/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.757438 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/container-server/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.861829 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/container-updater/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.932783 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-auditor/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.961140 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-expirer/0.log"
Jan 29 04:29:35 crc kubenswrapper[4707]: I0129 04:29:35.970928 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-replicator/0.log"
Jan 29 04:29:36 crc kubenswrapper[4707]: I0129 04:29:36.112729 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-server/0.log"
Jan 29 04:29:36 crc kubenswrapper[4707]: I0129 04:29:36.281062 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-updater/0.log"
Jan 29 04:29:36 crc kubenswrapper[4707]: I0129 04:29:36.305559 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/swift-recon-cron/0.log"
Jan 29 04:29:36 crc kubenswrapper[4707]: I0129 04:29:36.416142 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/rsync/0.log"
Jan 29 04:29:36 crc kubenswrapper[4707]: I0129 04:29:36.645341 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r_b8c300e6-01c5-493d-b263-2b6cdfaba0c9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:36 crc kubenswrapper[4707]: I0129 04:29:36.704040 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-892tt_bf895266-d1e5-47d5-8d3a-397acefb3f9b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 29 04:29:42 crc kubenswrapper[4707]: I0129 04:29:42.244820 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:29:42 crc kubenswrapper[4707]: E0129 04:29:42.245769 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:29:43 crc kubenswrapper[4707]: I0129 04:29:43.784277 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_33afc350-9c09-4d5f-aa86-80ccc0b670ba/memcached/0.log"
Jan 29 04:29:57 crc kubenswrapper[4707]: I0129 04:29:57.250054 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:29:57 crc kubenswrapper[4707]: E0129 04:29:57.250909 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.180603 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9"]
Jan 29 04:30:00 crc kubenswrapper[4707]: E0129 04:30:00.181889 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf91dd7-80ac-4b87-b2b6-d853054a57df" containerName="container-00"
Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.181935 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf91dd7-80ac-4b87-b2b6-d853054a57df" containerName="container-00"
Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.182184 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf91dd7-80ac-4b87-b2b6-d853054a57df" containerName="container-00"
Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.183019 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9"
Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.185903 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.186264 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.203465 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9"]
Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.266867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-config-volume\") pod \"collect-profiles-29494350-bxxb9\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9"
Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.267028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName:
\"kubernetes.io/secret/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-secret-volume\") pod \"collect-profiles-29494350-bxxb9\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.267211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgbc7\" (UniqueName: \"kubernetes.io/projected/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-kube-api-access-xgbc7\") pod \"collect-profiles-29494350-bxxb9\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.372139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-secret-volume\") pod \"collect-profiles-29494350-bxxb9\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.372290 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgbc7\" (UniqueName: \"kubernetes.io/projected/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-kube-api-access-xgbc7\") pod \"collect-profiles-29494350-bxxb9\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.372368 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-config-volume\") pod \"collect-profiles-29494350-bxxb9\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:00 crc 
kubenswrapper[4707]: I0129 04:30:00.373502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-config-volume\") pod \"collect-profiles-29494350-bxxb9\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.383175 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-secret-volume\") pod \"collect-profiles-29494350-bxxb9\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.396474 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgbc7\" (UniqueName: \"kubernetes.io/projected/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-kube-api-access-xgbc7\") pod \"collect-profiles-29494350-bxxb9\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:00 crc kubenswrapper[4707]: I0129 04:30:00.510148 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:01 crc kubenswrapper[4707]: I0129 04:30:01.030154 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9"] Jan 29 04:30:01 crc kubenswrapper[4707]: I0129 04:30:01.242028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" event={"ID":"120f3dcc-4481-4a2f-9cbc-373bc37af3a1","Type":"ContainerStarted","Data":"8c1e0c24c56a07bf1326c9d3d03fb102f65a86345952f540a3f919e8d74ff94c"} Jan 29 04:30:02 crc kubenswrapper[4707]: I0129 04:30:02.253844 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" event={"ID":"120f3dcc-4481-4a2f-9cbc-373bc37af3a1","Type":"ContainerStarted","Data":"3811902c5353aa03ac54d541678b8ed9da3d902a4ed57e007a465fa2553a8a25"} Jan 29 04:30:02 crc kubenswrapper[4707]: I0129 04:30:02.278350 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" podStartSLOduration=2.278332253 podStartE2EDuration="2.278332253s" podCreationTimestamp="2026-01-29 04:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:30:02.274192347 +0000 UTC m=+3755.758421252" watchObservedRunningTime="2026-01-29 04:30:02.278332253 +0000 UTC m=+3755.762561158" Jan 29 04:30:03 crc kubenswrapper[4707]: I0129 04:30:03.274581 4707 generic.go:334] "Generic (PLEG): container finished" podID="120f3dcc-4481-4a2f-9cbc-373bc37af3a1" containerID="3811902c5353aa03ac54d541678b8ed9da3d902a4ed57e007a465fa2553a8a25" exitCode=0 Jan 29 04:30:03 crc kubenswrapper[4707]: I0129 04:30:03.274652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" event={"ID":"120f3dcc-4481-4a2f-9cbc-373bc37af3a1","Type":"ContainerDied","Data":"3811902c5353aa03ac54d541678b8ed9da3d902a4ed57e007a465fa2553a8a25"} Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.674190 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.873632 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-secret-volume\") pod \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.873982 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-config-volume\") pod \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.874363 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgbc7\" (UniqueName: \"kubernetes.io/projected/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-kube-api-access-xgbc7\") pod \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\" (UID: \"120f3dcc-4481-4a2f-9cbc-373bc37af3a1\") " Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.874801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-config-volume" (OuterVolumeSpecName: "config-volume") pod "120f3dcc-4481-4a2f-9cbc-373bc37af3a1" (UID: "120f3dcc-4481-4a2f-9cbc-373bc37af3a1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.875960 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.883919 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "120f3dcc-4481-4a2f-9cbc-373bc37af3a1" (UID: "120f3dcc-4481-4a2f-9cbc-373bc37af3a1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.885212 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-kube-api-access-xgbc7" (OuterVolumeSpecName: "kube-api-access-xgbc7") pod "120f3dcc-4481-4a2f-9cbc-373bc37af3a1" (UID: "120f3dcc-4481-4a2f-9cbc-373bc37af3a1"). InnerVolumeSpecName "kube-api-access-xgbc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.977518 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgbc7\" (UniqueName: \"kubernetes.io/projected/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-kube-api-access-xgbc7\") on node \"crc\" DevicePath \"\"" Jan 29 04:30:04 crc kubenswrapper[4707]: I0129 04:30:04.977617 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/120f3dcc-4481-4a2f-9cbc-373bc37af3a1-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 04:30:05 crc kubenswrapper[4707]: I0129 04:30:05.305326 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" event={"ID":"120f3dcc-4481-4a2f-9cbc-373bc37af3a1","Type":"ContainerDied","Data":"8c1e0c24c56a07bf1326c9d3d03fb102f65a86345952f540a3f919e8d74ff94c"} Jan 29 04:30:05 crc kubenswrapper[4707]: I0129 04:30:05.305388 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c1e0c24c56a07bf1326c9d3d03fb102f65a86345952f540a3f919e8d74ff94c" Jan 29 04:30:05 crc kubenswrapper[4707]: I0129 04:30:05.305700 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494350-bxxb9" Jan 29 04:30:05 crc kubenswrapper[4707]: I0129 04:30:05.380678 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"] Jan 29 04:30:05 crc kubenswrapper[4707]: I0129 04:30:05.389768 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494305-62h5w"] Jan 29 04:30:07 crc kubenswrapper[4707]: I0129 04:30:07.258578 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe0e902-d9e3-4b02-b1d9-6f187a075acf" path="/var/lib/kubelet/pods/afe0e902-d9e3-4b02-b1d9-6f187a075acf/volumes" Jan 29 04:30:08 crc kubenswrapper[4707]: I0129 04:30:08.807706 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/util/0.log" Jan 29 04:30:08 crc kubenswrapper[4707]: I0129 04:30:08.975120 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/pull/0.log" Jan 29 04:30:08 crc kubenswrapper[4707]: I0129 04:30:08.993969 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/util/0.log" Jan 29 04:30:09 crc kubenswrapper[4707]: I0129 04:30:09.074730 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/pull/0.log" Jan 29 04:30:09 crc kubenswrapper[4707]: I0129 04:30:09.208614 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/extract/0.log" Jan 29 04:30:09 crc kubenswrapper[4707]: I0129 04:30:09.245522 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/pull/0.log" Jan 29 04:30:09 crc kubenswrapper[4707]: I0129 04:30:09.253170 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/util/0.log" Jan 29 04:30:09 crc kubenswrapper[4707]: I0129 04:30:09.551629 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-f6487bd57-qbdg4_155a3715-4600-4f83-8db3-a6beaf5c3394/manager/0.log" Jan 29 04:30:09 crc kubenswrapper[4707]: I0129 04:30:09.585480 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6bc7f4f4cf-9xlqh_6dbe27ba-a451-4202-8f58-73cb0684bfea/manager/0.log" Jan 29 04:30:09 crc kubenswrapper[4707]: I0129 04:30:09.708849 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66dfbd6f5d-jlsmf_f064b8fa-dd53-4fd8-8440-9e517b1c1279/manager/0.log" Jan 29 04:30:09 crc kubenswrapper[4707]: I0129 04:30:09.885400 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-6db5dbd896-c2sgx_3445268f-15c8-4438-8fd1-a13d2bd9981d/manager/0.log" Jan 29 04:30:10 crc kubenswrapper[4707]: I0129 04:30:10.007832 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-587c6bfdcf-c8v2v_de21d951-1d0b-415e-8923-5fa2cc58e439/manager/0.log" Jan 29 04:30:10 crc kubenswrapper[4707]: I0129 04:30:10.057203 
4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-z2vvp_59a9dc92-c9db-4bfa-8233-88b1690beaad/manager/0.log" Jan 29 04:30:10 crc kubenswrapper[4707]: I0129 04:30:10.292582 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-958664b5-qgj46_8255e85b-8815-4860-9325-7570ba9a6fd9/manager/0.log" Jan 29 04:30:10 crc kubenswrapper[4707]: I0129 04:30:10.607116 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-4wck9_09633ead-78c6-4934-95c2-05b24c6fc3e5/manager/0.log" Jan 29 04:30:10 crc kubenswrapper[4707]: I0129 04:30:10.638416 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6978b79747-49nhf_6fa97c9f-4b04-4795-9f11-9790c692ba0f/manager/0.log" Jan 29 04:30:10 crc kubenswrapper[4707]: I0129 04:30:10.668468 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-765668569f-jq2z4_02f283f2-5bf1-4ee7-ac34-751ffc96421c/manager/0.log" Jan 29 04:30:11 crc kubenswrapper[4707]: I0129 04:30:11.125073 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-jvdtd_55320b88-8f86-47bd-8718-6cabd0865a1c/manager/0.log" Jan 29 04:30:11 crc kubenswrapper[4707]: I0129 04:30:11.143382 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-694c5bfc85-g6dzc_7d2c1f08-0b63-4368-a7cc-9374d0dbf035/manager/0.log" Jan 29 04:30:11 crc kubenswrapper[4707]: I0129 04:30:11.244070 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:30:11 crc kubenswrapper[4707]: E0129 04:30:11.244342 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:30:11 crc kubenswrapper[4707]: I0129 04:30:11.363527 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5c765b4558-5jtr8_26d2ace7-4405-480c-acf8-233e1511007f/manager/0.log" Jan 29 04:30:11 crc kubenswrapper[4707]: I0129 04:30:11.417523 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-ddcbfd695-mrfw2_e15cd320-f902-4d99-8037-5c9355f4a833/manager/0.log" Jan 29 04:30:11 crc kubenswrapper[4707]: I0129 04:30:11.605766 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv_a6df0676-63de-4a83-bc60-9b69a2f8777f/manager/0.log" Jan 29 04:30:11 crc kubenswrapper[4707]: I0129 04:30:11.774846 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6955d4df64-rgqtz_76e29c5c-b257-48b7-953c-d7db3c6407ed/operator/0.log" Jan 29 04:30:11 crc kubenswrapper[4707]: I0129 04:30:11.966671 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-pnmmg_2553fa13-b0b4-45c7-9317-f6be21e7c1f0/registry-server/0.log" Jan 29 04:30:12 crc kubenswrapper[4707]: I0129 04:30:12.308642 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-4cp2t_e0372a1a-cd84-491e-a3d1-f58389a66b63/manager/0.log" Jan 29 04:30:12 crc kubenswrapper[4707]: I0129 04:30:12.415121 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-wblrt_f57d529d-1352-47d9-baa8-a2f383374b35/manager/0.log" Jan 29 04:30:12 crc kubenswrapper[4707]: I0129 04:30:12.674574 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-b6ggj_6fb86866-7c9d-4b4f-bf81-8a36898aca3d/operator/0.log" Jan 29 04:30:13 crc kubenswrapper[4707]: I0129 04:30:13.089160 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-zmvww_b7b5c12b-680b-4814-906c-62c9f8702559/manager/0.log" Jan 29 04:30:13 crc kubenswrapper[4707]: I0129 04:30:13.217001 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7886d5cc69-w8rzq_0a32b73c-f66f-425f-81a9-ef1cc36041d4/manager/0.log" Jan 29 04:30:13 crc kubenswrapper[4707]: I0129 04:30:13.312254 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-lnfzj_6e5e159a-c89d-43cf-b9cf-4a92de09ac22/manager/0.log" Jan 29 04:30:13 crc kubenswrapper[4707]: I0129 04:30:13.388583 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-767b8bc766-mhnm2_706ea7e5-d8b2-4bc1-900b-d62dddcad89e/manager/0.log" Jan 29 04:30:13 crc kubenswrapper[4707]: I0129 04:30:13.998644 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-cc96c49b6-x4zwn_d938abde-b4d6-4d4e-a176-9ed92ac5325d/manager/0.log" Jan 29 04:30:23 crc kubenswrapper[4707]: I0129 04:30:23.243466 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:30:23 crc kubenswrapper[4707]: E0129 04:30:23.244207 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:30:36 crc kubenswrapper[4707]: I0129 04:30:36.244217 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:30:36 crc kubenswrapper[4707]: E0129 04:30:36.245252 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:30:37 crc kubenswrapper[4707]: I0129 04:30:37.946496 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bzqlt_e8275a2b-4124-46d8-b2f1-4a7e8401e369/control-plane-machine-set-operator/0.log" Jan 29 04:30:38 crc kubenswrapper[4707]: I0129 04:30:38.155592 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mtt54_23df2202-fce8-4515-b147-1256fe6d953b/kube-rbac-proxy/0.log" Jan 29 04:30:38 crc kubenswrapper[4707]: I0129 04:30:38.166070 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mtt54_23df2202-fce8-4515-b147-1256fe6d953b/machine-api-operator/0.log" Jan 29 04:30:41 crc kubenswrapper[4707]: I0129 04:30:41.375851 4707 scope.go:117] "RemoveContainer" containerID="c77de9cdb6984bec3614db4d649beebf0d2a69ccf634e03a30a37f19f060266b" Jan 29 04:30:48 crc 
kubenswrapper[4707]: I0129 04:30:48.243894 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:30:48 crc kubenswrapper[4707]: E0129 04:30:48.244849 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:30:53 crc kubenswrapper[4707]: I0129 04:30:53.894528 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-khpr5_2eac8e69-1dad-4d12-bdc5-24cb7659f04d/cert-manager-controller/0.log" Jan 29 04:30:54 crc kubenswrapper[4707]: I0129 04:30:54.562861 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-h296l_ef3e8867-84ba-49f9-878e-482ae14faaa7/cert-manager-cainjector/0.log" Jan 29 04:30:54 crc kubenswrapper[4707]: I0129 04:30:54.586863 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vbkth_e32edc95-69cc-48f9-8840-a8bba34d4649/cert-manager-webhook/0.log" Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.316549 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n2sqp"] Jan 29 04:30:57 crc kubenswrapper[4707]: E0129 04:30:57.318201 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120f3dcc-4481-4a2f-9cbc-373bc37af3a1" containerName="collect-profiles" Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.318221 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="120f3dcc-4481-4a2f-9cbc-373bc37af3a1" containerName="collect-profiles" Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 
04:30:57.318558 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="120f3dcc-4481-4a2f-9cbc-373bc37af3a1" containerName="collect-profiles" Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.320849 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2sqp" Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.325746 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2sqp"] Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.405611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-catalog-content\") pod \"redhat-marketplace-n2sqp\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") " pod="openshift-marketplace/redhat-marketplace-n2sqp" Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.405734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-utilities\") pod \"redhat-marketplace-n2sqp\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") " pod="openshift-marketplace/redhat-marketplace-n2sqp" Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.405781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwjwh\" (UniqueName: \"kubernetes.io/projected/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-kube-api-access-dwjwh\") pod \"redhat-marketplace-n2sqp\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") " pod="openshift-marketplace/redhat-marketplace-n2sqp" Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.507383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-catalog-content\") pod \"redhat-marketplace-n2sqp\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") " pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.507469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-utilities\") pod \"redhat-marketplace-n2sqp\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") " pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.507519 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwjwh\" (UniqueName: \"kubernetes.io/projected/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-kube-api-access-dwjwh\") pod \"redhat-marketplace-n2sqp\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") " pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.507999 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-catalog-content\") pod \"redhat-marketplace-n2sqp\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") " pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.508303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-utilities\") pod \"redhat-marketplace-n2sqp\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") " pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.540722 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwjwh\" (UniqueName: \"kubernetes.io/projected/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-kube-api-access-dwjwh\") pod \"redhat-marketplace-n2sqp\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") " pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:30:57 crc kubenswrapper[4707]: I0129 04:30:57.645676 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:30:58 crc kubenswrapper[4707]: I0129 04:30:58.249300 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2sqp"]
Jan 29 04:30:58 crc kubenswrapper[4707]: I0129 04:30:58.871188 4707 generic.go:334] "Generic (PLEG): container finished" podID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerID="e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3" exitCode=0
Jan 29 04:30:58 crc kubenswrapper[4707]: I0129 04:30:58.871294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2sqp" event={"ID":"3632f328-f2ec-4f4d-a46e-0efe2a6872c6","Type":"ContainerDied","Data":"e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3"}
Jan 29 04:30:58 crc kubenswrapper[4707]: I0129 04:30:58.871801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2sqp" event={"ID":"3632f328-f2ec-4f4d-a46e-0efe2a6872c6","Type":"ContainerStarted","Data":"f1f20274c6d514510e0e8f9206407a7e1a7deaf7888d6c4d2f4cacb95a872b8c"}
Jan 29 04:30:59 crc kubenswrapper[4707]: I0129 04:30:59.885483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2sqp" event={"ID":"3632f328-f2ec-4f4d-a46e-0efe2a6872c6","Type":"ContainerStarted","Data":"0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf"}
Jan 29 04:31:00 crc kubenswrapper[4707]: I0129 04:31:00.906110 4707 generic.go:334] "Generic (PLEG): container finished" podID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerID="0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf" exitCode=0
Jan 29 04:31:00 crc kubenswrapper[4707]: I0129 04:31:00.906203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2sqp" event={"ID":"3632f328-f2ec-4f4d-a46e-0efe2a6872c6","Type":"ContainerDied","Data":"0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf"}
Jan 29 04:31:01 crc kubenswrapper[4707]: I0129 04:31:01.926431 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2sqp" event={"ID":"3632f328-f2ec-4f4d-a46e-0efe2a6872c6","Type":"ContainerStarted","Data":"a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b"}
Jan 29 04:31:01 crc kubenswrapper[4707]: I0129 04:31:01.963032 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n2sqp" podStartSLOduration=2.463061903 podStartE2EDuration="4.962990958s" podCreationTimestamp="2026-01-29 04:30:57 +0000 UTC" firstStartedPulling="2026-01-29 04:30:58.875007469 +0000 UTC m=+3812.359236374" lastFinishedPulling="2026-01-29 04:31:01.374936484 +0000 UTC m=+3814.859165429" observedRunningTime="2026-01-29 04:31:01.951489825 +0000 UTC m=+3815.435718730" watchObservedRunningTime="2026-01-29 04:31:01.962990958 +0000 UTC m=+3815.447219873"
Jan 29 04:31:02 crc kubenswrapper[4707]: I0129 04:31:02.246054 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:31:02 crc kubenswrapper[4707]: E0129 04:31:02.246798 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:31:07 crc kubenswrapper[4707]: I0129 04:31:07.647887 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:31:07 crc kubenswrapper[4707]: I0129 04:31:07.650830 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:31:07 crc kubenswrapper[4707]: I0129 04:31:07.707761 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:31:08 crc kubenswrapper[4707]: I0129 04:31:08.069691 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:31:08 crc kubenswrapper[4707]: I0129 04:31:08.157396 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2sqp"]
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.021052 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n2sqp" podUID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerName="registry-server" containerID="cri-o://a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b" gracePeriod=2
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.521609 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.707366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-catalog-content\") pod \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") "
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.707453 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwjwh\" (UniqueName: \"kubernetes.io/projected/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-kube-api-access-dwjwh\") pod \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") "
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.707573 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-utilities\") pod \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\" (UID: \"3632f328-f2ec-4f4d-a46e-0efe2a6872c6\") "
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.712149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-utilities" (OuterVolumeSpecName: "utilities") pod "3632f328-f2ec-4f4d-a46e-0efe2a6872c6" (UID: "3632f328-f2ec-4f4d-a46e-0efe2a6872c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.716784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-kube-api-access-dwjwh" (OuterVolumeSpecName: "kube-api-access-dwjwh") pod "3632f328-f2ec-4f4d-a46e-0efe2a6872c6" (UID: "3632f328-f2ec-4f4d-a46e-0efe2a6872c6"). InnerVolumeSpecName "kube-api-access-dwjwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.736119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3632f328-f2ec-4f4d-a46e-0efe2a6872c6" (UID: "3632f328-f2ec-4f4d-a46e-0efe2a6872c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.810877 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.810928 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwjwh\" (UniqueName: \"kubernetes.io/projected/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-kube-api-access-dwjwh\") on node \"crc\" DevicePath \"\""
Jan 29 04:31:10 crc kubenswrapper[4707]: I0129 04:31:10.810947 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3632f328-f2ec-4f4d-a46e-0efe2a6872c6-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.046060 4707 generic.go:334] "Generic (PLEG): container finished" podID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerID="a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b" exitCode=0
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.046127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2sqp" event={"ID":"3632f328-f2ec-4f4d-a46e-0efe2a6872c6","Type":"ContainerDied","Data":"a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b"}
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.046166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2sqp" event={"ID":"3632f328-f2ec-4f4d-a46e-0efe2a6872c6","Type":"ContainerDied","Data":"f1f20274c6d514510e0e8f9206407a7e1a7deaf7888d6c4d2f4cacb95a872b8c"}
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.046199 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2sqp"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.046197 4707 scope.go:117] "RemoveContainer" containerID="a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.101659 4707 scope.go:117] "RemoveContainer" containerID="0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.107569 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2sqp"]
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.118753 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2sqp"]
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.132783 4707 scope.go:117] "RemoveContainer" containerID="e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.184965 4707 scope.go:117] "RemoveContainer" containerID="a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b"
Jan 29 04:31:11 crc kubenswrapper[4707]: E0129 04:31:11.186856 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b\": container with ID starting with a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b not found: ID does not exist" containerID="a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.186897 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b"} err="failed to get container status \"a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b\": rpc error: code = NotFound desc = could not find container \"a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b\": container with ID starting with a6153bf42965e8f0f531f4fb63779b2d6f108aa87df65d72b128031c61e2993b not found: ID does not exist"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.186924 4707 scope.go:117] "RemoveContainer" containerID="0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf"
Jan 29 04:31:11 crc kubenswrapper[4707]: E0129 04:31:11.187856 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf\": container with ID starting with 0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf not found: ID does not exist" containerID="0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.187889 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf"} err="failed to get container status \"0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf\": rpc error: code = NotFound desc = could not find container \"0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf\": container with ID starting with 0c3eddd1474785c8ec9cea8f6f53854cc51ccbe2d4ffca80794545f3836e48cf not found: ID does not exist"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.187906 4707 scope.go:117] "RemoveContainer" containerID="e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3"
Jan 29 04:31:11 crc kubenswrapper[4707]: E0129 04:31:11.188408 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3\": container with ID starting with e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3 not found: ID does not exist" containerID="e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.188468 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3"} err="failed to get container status \"e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3\": rpc error: code = NotFound desc = could not find container \"e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3\": container with ID starting with e3329a885ec71a03b7a09530151f51300157821cf49a42e07b5d47d7677c43c3 not found: ID does not exist"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.255439 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" path="/var/lib/kubelet/pods/3632f328-f2ec-4f4d-a46e-0efe2a6872c6/volumes"
Jan 29 04:31:11 crc kubenswrapper[4707]: I0129 04:31:11.315872 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-rkzwz_4657ed73-851c-43e4-9f85-e06471c81722/nmstate-console-plugin/0.log"
Jan 29 04:31:12 crc kubenswrapper[4707]: I0129 04:31:12.041701 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kdqjs_966ffde7-06ec-4066-b9db-b4b1e750095f/kube-rbac-proxy/0.log"
Jan 29 04:31:12 crc kubenswrapper[4707]: I0129 04:31:12.043338 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kdqjs_966ffde7-06ec-4066-b9db-b4b1e750095f/nmstate-metrics/0.log"
Jan 29 04:31:12 crc kubenswrapper[4707]: I0129 04:31:12.044972 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xvfr7_35c9d599-ff9e-4713-8c0d-6e72c41f6859/nmstate-handler/0.log"
Jan 29 04:31:12 crc kubenswrapper[4707]: I0129 04:31:12.239015 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-6497v_172d1247-a499-49cf-a003-3c70d059385f/nmstate-operator/0.log"
Jan 29 04:31:12 crc kubenswrapper[4707]: I0129 04:31:12.282460 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-7rh4l_7eec8389-133d-412b-a2f6-813eaf6e6468/nmstate-webhook/0.log"
Jan 29 04:31:17 crc kubenswrapper[4707]: I0129 04:31:17.254368 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:31:17 crc kubenswrapper[4707]: E0129 04:31:17.255835 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:31:27 crc kubenswrapper[4707]: I0129 04:31:27.890081 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-zggfz_aac893f7-df17-486c-895f-b5305b76bc60/prometheus-operator/0.log"
Jan 29 04:31:28 crc kubenswrapper[4707]: I0129 04:31:28.073456 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd586f795-dhgdl_23eb4701-1c82-40e4-990b-87c4044f51cc/prometheus-operator-admission-webhook/0.log"
Jan 29 04:31:28 crc kubenswrapper[4707]: I0129 04:31:28.136528 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd586f795-qmjjg_fd346f51-b69b-4ac8-b4d3-d24201dd0015/prometheus-operator-admission-webhook/0.log"
Jan 29 04:31:28 crc kubenswrapper[4707]: I0129 04:31:28.299376 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-gkwm8_cb2589dc-26af-42a0-8fb9-8f908a0fbac9/operator/0.log"
Jan 29 04:31:28 crc kubenswrapper[4707]: I0129 04:31:28.383760 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-b6clh_1ab8ee44-5e15-42f6-861c-071cb82c90d4/perses-operator/0.log"
Jan 29 04:31:31 crc kubenswrapper[4707]: I0129 04:31:31.244802 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:31:31 crc kubenswrapper[4707]: E0129 04:31:31.246078 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:31:43 crc kubenswrapper[4707]: I0129 04:31:43.244869 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:31:43 crc kubenswrapper[4707]: E0129 04:31:43.246054 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.075087 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-r9vjq_063768b8-90c6-4b82-b3d2-13fbdc42bab5/kube-rbac-proxy/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.099439 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-r9vjq_063768b8-90c6-4b82-b3d2-13fbdc42bab5/controller/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.309569 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-frr-files/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.479484 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-reloader/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.479711 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-frr-files/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.528378 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-reloader/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.529112 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-metrics/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.719370 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-metrics/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.720266 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-frr-files/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.726292 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-reloader/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.762349 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-metrics/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.931976 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-metrics/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.932298 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-frr-files/0.log"
Jan 29 04:31:44 crc kubenswrapper[4707]: I0129 04:31:44.935326 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-reloader/0.log"
Jan 29 04:31:45 crc kubenswrapper[4707]: I0129 04:31:45.005242 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/controller/0.log"
Jan 29 04:31:45 crc kubenswrapper[4707]: I0129 04:31:45.107051 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/frr-metrics/0.log"
Jan 29 04:31:45 crc kubenswrapper[4707]: I0129 04:31:45.135470 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/kube-rbac-proxy/0.log"
Jan 29 04:31:45 crc kubenswrapper[4707]: I0129 04:31:45.251387 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/kube-rbac-proxy-frr/0.log"
Jan 29 04:31:45 crc kubenswrapper[4707]: I0129 04:31:45.397080 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/reloader/0.log"
Jan 29 04:31:45 crc kubenswrapper[4707]: I0129 04:31:45.473691 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-4vtxc_a43d3509-ef8d-47ba-b60f-675e3113086d/frr-k8s-webhook-server/0.log"
Jan 29 04:31:45 crc kubenswrapper[4707]: I0129 04:31:45.648519 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76498b594b-4z2xj_403318b7-a0b4-4a62-8094-9a2ac1127387/manager/0.log"
Jan 29 04:31:45 crc kubenswrapper[4707]: I0129 04:31:45.973184 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-866946c6ff-xnqlj_126a1c1c-acae-407d-854d-fbeb74a88a9c/webhook-server/0.log"
Jan 29 04:31:45 crc kubenswrapper[4707]: I0129 04:31:45.992857 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2frrd_1d738b61-6875-468d-8fdb-c0d567c8ea88/kube-rbac-proxy/0.log"
Jan 29 04:31:46 crc kubenswrapper[4707]: I0129 04:31:46.854998 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2frrd_1d738b61-6875-468d-8fdb-c0d567c8ea88/speaker/0.log"
Jan 29 04:31:47 crc kubenswrapper[4707]: I0129 04:31:47.066453 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/frr/0.log"
Jan 29 04:31:55 crc kubenswrapper[4707]: I0129 04:31:55.245923 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:31:55 crc kubenswrapper[4707]: E0129 04:31:55.247005 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:32:02 crc kubenswrapper[4707]: I0129 04:32:02.496756 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/util/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.093122 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/pull/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.126895 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/util/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.155004 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/pull/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.303937 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/util/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.317660 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/pull/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.356189 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/extract/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.477048 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/util/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.722511 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/pull/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.778899 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/pull/0.log"
Jan 29 04:32:03 crc kubenswrapper[4707]: I0129 04:32:03.786945 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/util/0.log"
Jan 29 04:32:04 crc kubenswrapper[4707]: I0129 04:32:04.222178 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/util/0.log"
Jan 29 04:32:04 crc kubenswrapper[4707]: I0129 04:32:04.312415 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/pull/0.log"
Jan 29 04:32:04 crc kubenswrapper[4707]: I0129 04:32:04.346994 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/extract/0.log"
Jan 29 04:32:04 crc kubenswrapper[4707]: I0129 04:32:04.470784 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/util/0.log"
Jan 29 04:32:04 crc kubenswrapper[4707]: I0129 04:32:04.657949 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/util/0.log"
Jan 29 04:32:04 crc kubenswrapper[4707]: I0129 04:32:04.666577 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/pull/0.log"
Jan 29 04:32:04 crc kubenswrapper[4707]: I0129 04:32:04.713220 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/pull/0.log"
Jan 29 04:32:05 crc kubenswrapper[4707]: I0129 04:32:05.427060 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/extract/0.log"
Jan 29 04:32:05 crc kubenswrapper[4707]: I0129 04:32:05.461696 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/util/0.log"
Jan 29 04:32:05 crc kubenswrapper[4707]: I0129 04:32:05.507330 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/pull/0.log"
Jan 29 04:32:05 crc kubenswrapper[4707]: I0129 04:32:05.629929 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-utilities/0.log"
Jan 29 04:32:05 crc kubenswrapper[4707]: I0129 04:32:05.835815 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-utilities/0.log"
Jan 29 04:32:05 crc kubenswrapper[4707]: I0129 04:32:05.872198 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-content/0.log"
Jan 29 04:32:05 crc kubenswrapper[4707]: I0129 04:32:05.895150 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-content/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.022335 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-utilities/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.045119 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-content/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.224949 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-utilities/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.493746 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-utilities/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.499572 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-content/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.502126 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-content/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.642612 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/registry-server/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.814448 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-utilities/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.896131 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-content/0.log"
Jan 29 04:32:06 crc kubenswrapper[4707]: I0129 04:32:06.956888 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qwgfr_d3599142-c844-4a86-9bef-e589d69f0ef4/marketplace-operator/0.log"
Jan 29 04:32:07 crc kubenswrapper[4707]: I0129 04:32:07.281854 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/registry-server/0.log"
Jan 29 04:32:07 crc kubenswrapper[4707]: I0129 04:32:07.285940 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-utilities/0.log"
Jan 29 04:32:07 crc kubenswrapper[4707]: I0129 04:32:07.451512 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-utilities/0.log"
Jan 29 04:32:07 crc kubenswrapper[4707]: I0129 04:32:07.462673 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-content/0.log"
Jan 29 04:32:07 crc kubenswrapper[4707]: I0129 04:32:07.480211 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-content/0.log"
Jan 29 04:32:07 crc kubenswrapper[4707]: I0129 04:32:07.990270 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-content/0.log"
Jan 29 04:32:08 crc kubenswrapper[4707]: I0129 04:32:08.025788 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-utilities/0.log"
Jan 29 04:32:08 crc kubenswrapper[4707]: I0129 04:32:08.038619 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-utilities/0.log"
Jan 29 04:32:08 crc kubenswrapper[4707]: I0129 04:32:08.098327 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/registry-server/0.log"
Jan 29 04:32:08 crc kubenswrapper[4707]: I0129 04:32:08.286622 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-content/0.log"
Jan 29 04:32:08 crc kubenswrapper[4707]: I0129 04:32:08.288047 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-utilities/0.log"
Jan 29 04:32:08 crc kubenswrapper[4707]: I0129 04:32:08.289098 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-content/0.log"
Jan 29 04:32:08 crc kubenswrapper[4707]: I0129 04:32:08.454341 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-utilities/0.log"
Jan 29 04:32:08 crc kubenswrapper[4707]: I0129 04:32:08.488359 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-content/0.log"
Jan 29 04:32:09 crc kubenswrapper[4707]: I0129 04:32:09.000315 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/registry-server/0.log"
Jan 29 04:32:10 crc kubenswrapper[4707]: I0129 04:32:10.243959 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:32:10 crc kubenswrapper[4707]: E0129 04:32:10.244692 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8"
Jan 29 04:32:23 crc kubenswrapper[4707]: I0129 04:32:23.244513 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b"
Jan 29 04:32:23 crc kubenswrapper[4707]: E0129 04:32:23.245611 4707 pod_workers.go:1301]
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:32:27 crc kubenswrapper[4707]: I0129 04:32:27.286757 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd586f795-dhgdl_23eb4701-1c82-40e4-990b-87c4044f51cc/prometheus-operator-admission-webhook/0.log" Jan 29 04:32:27 crc kubenswrapper[4707]: I0129 04:32:27.323562 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-zggfz_aac893f7-df17-486c-895f-b5305b76bc60/prometheus-operator/0.log" Jan 29 04:32:27 crc kubenswrapper[4707]: I0129 04:32:27.343633 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd586f795-qmjjg_fd346f51-b69b-4ac8-b4d3-d24201dd0015/prometheus-operator-admission-webhook/0.log" Jan 29 04:32:27 crc kubenswrapper[4707]: I0129 04:32:27.800497 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-gkwm8_cb2589dc-26af-42a0-8fb9-8f908a0fbac9/operator/0.log" Jan 29 04:32:27 crc kubenswrapper[4707]: I0129 04:32:27.860207 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-b6clh_1ab8ee44-5e15-42f6-861c-071cb82c90d4/perses-operator/0.log" Jan 29 04:32:36 crc kubenswrapper[4707]: I0129 04:32:36.243822 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:32:37 crc kubenswrapper[4707]: I0129 04:32:37.112651 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"51f9ca91f11952f12ffda104ffac66609fac364d355d6649b07810cf32060d88"} Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.743417 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qjtll"] Jan 29 04:32:50 crc kubenswrapper[4707]: E0129 04:32:50.745043 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerName="extract-utilities" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.745068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerName="extract-utilities" Jan 29 04:32:50 crc kubenswrapper[4707]: E0129 04:32:50.745112 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerName="extract-content" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.745121 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerName="extract-content" Jan 29 04:32:50 crc kubenswrapper[4707]: E0129 04:32:50.745151 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerName="registry-server" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.745160 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerName="registry-server" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.745519 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3632f328-f2ec-4f4d-a46e-0efe2a6872c6" containerName="registry-server" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.752837 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.783853 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qjtll"] Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.834926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-utilities\") pod \"redhat-operators-qjtll\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.835149 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-catalog-content\") pod \"redhat-operators-qjtll\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.835218 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs6xv\" (UniqueName: \"kubernetes.io/projected/df392c98-0714-45a2-9de7-85f7280f69a7-kube-api-access-qs6xv\") pod \"redhat-operators-qjtll\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.938724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-utilities\") pod \"redhat-operators-qjtll\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.938886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-catalog-content\") pod \"redhat-operators-qjtll\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.938967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs6xv\" (UniqueName: \"kubernetes.io/projected/df392c98-0714-45a2-9de7-85f7280f69a7-kube-api-access-qs6xv\") pod \"redhat-operators-qjtll\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.939221 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-utilities\") pod \"redhat-operators-qjtll\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.939665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-catalog-content\") pod \"redhat-operators-qjtll\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:50 crc kubenswrapper[4707]: I0129 04:32:50.971677 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs6xv\" (UniqueName: \"kubernetes.io/projected/df392c98-0714-45a2-9de7-85f7280f69a7-kube-api-access-qs6xv\") pod \"redhat-operators-qjtll\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:51 crc kubenswrapper[4707]: I0129 04:32:51.086891 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:32:51 crc kubenswrapper[4707]: I0129 04:32:51.696186 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qjtll"] Jan 29 04:32:52 crc kubenswrapper[4707]: I0129 04:32:52.311304 4707 generic.go:334] "Generic (PLEG): container finished" podID="df392c98-0714-45a2-9de7-85f7280f69a7" containerID="73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e" exitCode=0 Jan 29 04:32:52 crc kubenswrapper[4707]: I0129 04:32:52.311424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjtll" event={"ID":"df392c98-0714-45a2-9de7-85f7280f69a7","Type":"ContainerDied","Data":"73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e"} Jan 29 04:32:52 crc kubenswrapper[4707]: I0129 04:32:52.311736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjtll" event={"ID":"df392c98-0714-45a2-9de7-85f7280f69a7","Type":"ContainerStarted","Data":"687a93c055dfb2f7d175936788367a690e8a8aac433bbff8db68b8bdb353eba9"} Jan 29 04:32:52 crc kubenswrapper[4707]: I0129 04:32:52.316049 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 04:32:53 crc kubenswrapper[4707]: I0129 04:32:53.325985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjtll" event={"ID":"df392c98-0714-45a2-9de7-85f7280f69a7","Type":"ContainerStarted","Data":"0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a"} Jan 29 04:32:57 crc kubenswrapper[4707]: I0129 04:32:57.384441 4707 generic.go:334] "Generic (PLEG): container finished" podID="df392c98-0714-45a2-9de7-85f7280f69a7" containerID="0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a" exitCode=0 Jan 29 04:32:57 crc kubenswrapper[4707]: I0129 04:32:57.384501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-qjtll" event={"ID":"df392c98-0714-45a2-9de7-85f7280f69a7","Type":"ContainerDied","Data":"0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a"} Jan 29 04:32:58 crc kubenswrapper[4707]: I0129 04:32:58.397570 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjtll" event={"ID":"df392c98-0714-45a2-9de7-85f7280f69a7","Type":"ContainerStarted","Data":"0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243"} Jan 29 04:32:58 crc kubenswrapper[4707]: I0129 04:32:58.430479 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qjtll" podStartSLOduration=2.90056581 podStartE2EDuration="8.430428711s" podCreationTimestamp="2026-01-29 04:32:50 +0000 UTC" firstStartedPulling="2026-01-29 04:32:52.315794598 +0000 UTC m=+3925.800023503" lastFinishedPulling="2026-01-29 04:32:57.845657479 +0000 UTC m=+3931.329886404" observedRunningTime="2026-01-29 04:32:58.423710522 +0000 UTC m=+3931.907939447" watchObservedRunningTime="2026-01-29 04:32:58.430428711 +0000 UTC m=+3931.914657616" Jan 29 04:33:01 crc kubenswrapper[4707]: I0129 04:33:01.087429 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:33:01 crc kubenswrapper[4707]: I0129 04:33:01.087772 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:33:02 crc kubenswrapper[4707]: I0129 04:33:02.147829 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qjtll" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" containerName="registry-server" probeResult="failure" output=< Jan 29 04:33:02 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 29 04:33:02 crc kubenswrapper[4707]: > Jan 29 04:33:11 crc kubenswrapper[4707]: I0129 
04:33:11.146759 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:33:11 crc kubenswrapper[4707]: I0129 04:33:11.206211 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:33:11 crc kubenswrapper[4707]: I0129 04:33:11.390288 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qjtll"] Jan 29 04:33:12 crc kubenswrapper[4707]: I0129 04:33:12.554672 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qjtll" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" containerName="registry-server" containerID="cri-o://0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243" gracePeriod=2 Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.042089 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.066656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs6xv\" (UniqueName: \"kubernetes.io/projected/df392c98-0714-45a2-9de7-85f7280f69a7-kube-api-access-qs6xv\") pod \"df392c98-0714-45a2-9de7-85f7280f69a7\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.066766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-catalog-content\") pod \"df392c98-0714-45a2-9de7-85f7280f69a7\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.066806 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-utilities\") pod \"df392c98-0714-45a2-9de7-85f7280f69a7\" (UID: \"df392c98-0714-45a2-9de7-85f7280f69a7\") " Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.067700 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-utilities" (OuterVolumeSpecName: "utilities") pod "df392c98-0714-45a2-9de7-85f7280f69a7" (UID: "df392c98-0714-45a2-9de7-85f7280f69a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.123973 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df392c98-0714-45a2-9de7-85f7280f69a7-kube-api-access-qs6xv" (OuterVolumeSpecName: "kube-api-access-qs6xv") pod "df392c98-0714-45a2-9de7-85f7280f69a7" (UID: "df392c98-0714-45a2-9de7-85f7280f69a7"). InnerVolumeSpecName "kube-api-access-qs6xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.169353 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs6xv\" (UniqueName: \"kubernetes.io/projected/df392c98-0714-45a2-9de7-85f7280f69a7-kube-api-access-qs6xv\") on node \"crc\" DevicePath \"\"" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.169413 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.252302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df392c98-0714-45a2-9de7-85f7280f69a7" (UID: "df392c98-0714-45a2-9de7-85f7280f69a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.274810 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df392c98-0714-45a2-9de7-85f7280f69a7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.571194 4707 generic.go:334] "Generic (PLEG): container finished" podID="df392c98-0714-45a2-9de7-85f7280f69a7" containerID="0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243" exitCode=0 Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.571306 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qjtll" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.572813 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjtll" event={"ID":"df392c98-0714-45a2-9de7-85f7280f69a7","Type":"ContainerDied","Data":"0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243"} Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.573102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qjtll" event={"ID":"df392c98-0714-45a2-9de7-85f7280f69a7","Type":"ContainerDied","Data":"687a93c055dfb2f7d175936788367a690e8a8aac433bbff8db68b8bdb353eba9"} Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.573173 4707 scope.go:117] "RemoveContainer" containerID="0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.608704 4707 scope.go:117] "RemoveContainer" containerID="0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.617856 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qjtll"] Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 
04:33:13.632588 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qjtll"] Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.651769 4707 scope.go:117] "RemoveContainer" containerID="73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.694221 4707 scope.go:117] "RemoveContainer" containerID="0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243" Jan 29 04:33:13 crc kubenswrapper[4707]: E0129 04:33:13.695453 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243\": container with ID starting with 0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243 not found: ID does not exist" containerID="0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.695496 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243"} err="failed to get container status \"0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243\": rpc error: code = NotFound desc = could not find container \"0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243\": container with ID starting with 0aa6057e1f0376fea3ceffc16b1815c613f4c722f5ea77836914420d33d3d243 not found: ID does not exist" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.695527 4707 scope.go:117] "RemoveContainer" containerID="0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a" Jan 29 04:33:13 crc kubenswrapper[4707]: E0129 04:33:13.696206 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a\": container with ID 
starting with 0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a not found: ID does not exist" containerID="0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.696305 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a"} err="failed to get container status \"0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a\": rpc error: code = NotFound desc = could not find container \"0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a\": container with ID starting with 0c6129367972b8d0238c53f24fb5b878e90095338fa218a3ae5b5b0e79521a4a not found: ID does not exist" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.696371 4707 scope.go:117] "RemoveContainer" containerID="73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e" Jan 29 04:33:13 crc kubenswrapper[4707]: E0129 04:33:13.696872 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e\": container with ID starting with 73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e not found: ID does not exist" containerID="73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e" Jan 29 04:33:13 crc kubenswrapper[4707]: I0129 04:33:13.696909 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e"} err="failed to get container status \"73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e\": rpc error: code = NotFound desc = could not find container \"73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e\": container with ID starting with 73e97c60440629c439a8d6d5e5c96ffd6bac92d226c89b85eb3be17f33f4f06e not found: 
ID does not exist" Jan 29 04:33:15 crc kubenswrapper[4707]: I0129 04:33:15.260584 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" path="/var/lib/kubelet/pods/df392c98-0714-45a2-9de7-85f7280f69a7/volumes" Jan 29 04:33:51 crc kubenswrapper[4707]: I0129 04:33:51.937367 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fr5nd"] Jan 29 04:33:51 crc kubenswrapper[4707]: E0129 04:33:51.938602 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" containerName="extract-content" Jan 29 04:33:51 crc kubenswrapper[4707]: I0129 04:33:51.938621 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" containerName="extract-content" Jan 29 04:33:51 crc kubenswrapper[4707]: E0129 04:33:51.938645 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" containerName="extract-utilities" Jan 29 04:33:51 crc kubenswrapper[4707]: I0129 04:33:51.938655 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" containerName="extract-utilities" Jan 29 04:33:51 crc kubenswrapper[4707]: E0129 04:33:51.938689 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" containerName="registry-server" Jan 29 04:33:51 crc kubenswrapper[4707]: I0129 04:33:51.938697 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" containerName="registry-server" Jan 29 04:33:51 crc kubenswrapper[4707]: I0129 04:33:51.939051 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="df392c98-0714-45a2-9de7-85f7280f69a7" containerName="registry-server" Jan 29 04:33:51 crc kubenswrapper[4707]: I0129 04:33:51.941780 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:51 crc kubenswrapper[4707]: I0129 04:33:51.961427 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fr5nd"] Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.077840 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-utilities\") pod \"community-operators-fr5nd\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.078064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-catalog-content\") pod \"community-operators-fr5nd\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.078256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wfpr\" (UniqueName: \"kubernetes.io/projected/26c27af4-69fa-4741-8a2d-06b15aa288b5-kube-api-access-4wfpr\") pod \"community-operators-fr5nd\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.180308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-catalog-content\") pod \"community-operators-fr5nd\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.180386 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4wfpr\" (UniqueName: \"kubernetes.io/projected/26c27af4-69fa-4741-8a2d-06b15aa288b5-kube-api-access-4wfpr\") pod \"community-operators-fr5nd\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.180482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-utilities\") pod \"community-operators-fr5nd\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.181038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-catalog-content\") pod \"community-operators-fr5nd\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.181126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-utilities\") pod \"community-operators-fr5nd\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.244548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wfpr\" (UniqueName: \"kubernetes.io/projected/26c27af4-69fa-4741-8a2d-06b15aa288b5-kube-api-access-4wfpr\") pod \"community-operators-fr5nd\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.263959 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.903551 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fr5nd"] Jan 29 04:33:52 crc kubenswrapper[4707]: I0129 04:33:52.994034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr5nd" event={"ID":"26c27af4-69fa-4741-8a2d-06b15aa288b5","Type":"ContainerStarted","Data":"e4b9b3c59528ccb39b770c4d7304e0df34876e95c8d0427d21ac53d70aab9992"} Jan 29 04:33:54 crc kubenswrapper[4707]: I0129 04:33:54.005692 4707 generic.go:334] "Generic (PLEG): container finished" podID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerID="ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26" exitCode=0 Jan 29 04:33:54 crc kubenswrapper[4707]: I0129 04:33:54.005852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr5nd" event={"ID":"26c27af4-69fa-4741-8a2d-06b15aa288b5","Type":"ContainerDied","Data":"ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26"} Jan 29 04:33:56 crc kubenswrapper[4707]: I0129 04:33:56.038527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr5nd" event={"ID":"26c27af4-69fa-4741-8a2d-06b15aa288b5","Type":"ContainerStarted","Data":"707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9"} Jan 29 04:33:57 crc kubenswrapper[4707]: I0129 04:33:57.071212 4707 generic.go:334] "Generic (PLEG): container finished" podID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerID="707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9" exitCode=0 Jan 29 04:33:57 crc kubenswrapper[4707]: I0129 04:33:57.071680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr5nd" 
event={"ID":"26c27af4-69fa-4741-8a2d-06b15aa288b5","Type":"ContainerDied","Data":"707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9"} Jan 29 04:33:58 crc kubenswrapper[4707]: I0129 04:33:58.083583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr5nd" event={"ID":"26c27af4-69fa-4741-8a2d-06b15aa288b5","Type":"ContainerStarted","Data":"9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1"} Jan 29 04:33:58 crc kubenswrapper[4707]: I0129 04:33:58.113672 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fr5nd" podStartSLOduration=3.532788647 podStartE2EDuration="7.113652886s" podCreationTimestamp="2026-01-29 04:33:51 +0000 UTC" firstStartedPulling="2026-01-29 04:33:54.008055792 +0000 UTC m=+3987.492284697" lastFinishedPulling="2026-01-29 04:33:57.588920031 +0000 UTC m=+3991.073148936" observedRunningTime="2026-01-29 04:33:58.108955584 +0000 UTC m=+3991.593184489" watchObservedRunningTime="2026-01-29 04:33:58.113652886 +0000 UTC m=+3991.597881801" Jan 29 04:34:02 crc kubenswrapper[4707]: I0129 04:34:02.265280 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:34:02 crc kubenswrapper[4707]: I0129 04:34:02.265879 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:34:02 crc kubenswrapper[4707]: I0129 04:34:02.323073 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:34:03 crc kubenswrapper[4707]: I0129 04:34:03.219517 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:34:03 crc kubenswrapper[4707]: I0129 04:34:03.284800 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-fr5nd"] Jan 29 04:34:05 crc kubenswrapper[4707]: I0129 04:34:05.159087 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fr5nd" podUID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerName="registry-server" containerID="cri-o://9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1" gracePeriod=2 Jan 29 04:34:05 crc kubenswrapper[4707]: I0129 04:34:05.859637 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:34:05 crc kubenswrapper[4707]: I0129 04:34:05.924990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-utilities\") pod \"26c27af4-69fa-4741-8a2d-06b15aa288b5\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " Jan 29 04:34:05 crc kubenswrapper[4707]: I0129 04:34:05.925252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-catalog-content\") pod \"26c27af4-69fa-4741-8a2d-06b15aa288b5\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " Jan 29 04:34:05 crc kubenswrapper[4707]: I0129 04:34:05.925295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wfpr\" (UniqueName: \"kubernetes.io/projected/26c27af4-69fa-4741-8a2d-06b15aa288b5-kube-api-access-4wfpr\") pod \"26c27af4-69fa-4741-8a2d-06b15aa288b5\" (UID: \"26c27af4-69fa-4741-8a2d-06b15aa288b5\") " Jan 29 04:34:05 crc kubenswrapper[4707]: I0129 04:34:05.926317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-utilities" (OuterVolumeSpecName: "utilities") pod "26c27af4-69fa-4741-8a2d-06b15aa288b5" (UID: 
"26c27af4-69fa-4741-8a2d-06b15aa288b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:34:05 crc kubenswrapper[4707]: I0129 04:34:05.926697 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:34:05 crc kubenswrapper[4707]: I0129 04:34:05.948292 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c27af4-69fa-4741-8a2d-06b15aa288b5-kube-api-access-4wfpr" (OuterVolumeSpecName: "kube-api-access-4wfpr") pod "26c27af4-69fa-4741-8a2d-06b15aa288b5" (UID: "26c27af4-69fa-4741-8a2d-06b15aa288b5"). InnerVolumeSpecName "kube-api-access-4wfpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:34:05 crc kubenswrapper[4707]: I0129 04:34:05.996655 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26c27af4-69fa-4741-8a2d-06b15aa288b5" (UID: "26c27af4-69fa-4741-8a2d-06b15aa288b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.028970 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c27af4-69fa-4741-8a2d-06b15aa288b5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.029047 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wfpr\" (UniqueName: \"kubernetes.io/projected/26c27af4-69fa-4741-8a2d-06b15aa288b5-kube-api-access-4wfpr\") on node \"crc\" DevicePath \"\"" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.172004 4707 generic.go:334] "Generic (PLEG): container finished" podID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerID="9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1" exitCode=0 Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.172053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr5nd" event={"ID":"26c27af4-69fa-4741-8a2d-06b15aa288b5","Type":"ContainerDied","Data":"9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1"} Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.172089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fr5nd" event={"ID":"26c27af4-69fa-4741-8a2d-06b15aa288b5","Type":"ContainerDied","Data":"e4b9b3c59528ccb39b770c4d7304e0df34876e95c8d0427d21ac53d70aab9992"} Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.172106 4707 scope.go:117] "RemoveContainer" containerID="9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.172189 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fr5nd" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.194379 4707 scope.go:117] "RemoveContainer" containerID="707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.220011 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fr5nd"] Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.233879 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fr5nd"] Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.252348 4707 scope.go:117] "RemoveContainer" containerID="ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.278306 4707 scope.go:117] "RemoveContainer" containerID="9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1" Jan 29 04:34:06 crc kubenswrapper[4707]: E0129 04:34:06.278840 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1\": container with ID starting with 9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1 not found: ID does not exist" containerID="9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.278944 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1"} err="failed to get container status \"9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1\": rpc error: code = NotFound desc = could not find container \"9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1\": container with ID starting with 9e83098c5dfe80c6a9bf8e3f377e7f4934765c7e03e584b4fb3db2eb568f5ea1 not 
found: ID does not exist" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.279115 4707 scope.go:117] "RemoveContainer" containerID="707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9" Jan 29 04:34:06 crc kubenswrapper[4707]: E0129 04:34:06.279580 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9\": container with ID starting with 707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9 not found: ID does not exist" containerID="707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.279631 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9"} err="failed to get container status \"707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9\": rpc error: code = NotFound desc = could not find container \"707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9\": container with ID starting with 707c6532ff78430d7d5073d8bd9fd19a3302cf3e16cb804ce202b5cc785198b9 not found: ID does not exist" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.279666 4707 scope.go:117] "RemoveContainer" containerID="ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26" Jan 29 04:34:06 crc kubenswrapper[4707]: E0129 04:34:06.279965 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26\": container with ID starting with ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26 not found: ID does not exist" containerID="ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26" Jan 29 04:34:06 crc kubenswrapper[4707]: I0129 04:34:06.279999 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26"} err="failed to get container status \"ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26\": rpc error: code = NotFound desc = could not find container \"ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26\": container with ID starting with ceec555ce717aea35c3d25ad3540836f47fd6f16f906aea32a33d7bb60490c26 not found: ID does not exist" Jan 29 04:34:07 crc kubenswrapper[4707]: I0129 04:34:07.255948 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c27af4-69fa-4741-8a2d-06b15aa288b5" path="/var/lib/kubelet/pods/26c27af4-69fa-4741-8a2d-06b15aa288b5/volumes" Jan 29 04:34:16 crc kubenswrapper[4707]: I0129 04:34:16.292591 4707 generic.go:334] "Generic (PLEG): container finished" podID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" containerID="f1394075470c15b524dac2a37cdebd724b5aebc63ce6136c819249b730e8e058" exitCode=0 Jan 29 04:34:16 crc kubenswrapper[4707]: I0129 04:34:16.292666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4k5/must-gather-786t8" event={"ID":"dfd22e26-c073-47dd-b0b4-4d58d6f93522","Type":"ContainerDied","Data":"f1394075470c15b524dac2a37cdebd724b5aebc63ce6136c819249b730e8e058"} Jan 29 04:34:16 crc kubenswrapper[4707]: I0129 04:34:16.294026 4707 scope.go:117] "RemoveContainer" containerID="f1394075470c15b524dac2a37cdebd724b5aebc63ce6136c819249b730e8e058" Jan 29 04:34:17 crc kubenswrapper[4707]: I0129 04:34:17.121003 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4k4k5_must-gather-786t8_dfd22e26-c073-47dd-b0b4-4d58d6f93522/gather/0.log" Jan 29 04:34:25 crc kubenswrapper[4707]: I0129 04:34:25.785241 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4k5/must-gather-786t8"] Jan 29 04:34:25 crc kubenswrapper[4707]: I0129 04:34:25.786087 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4k4k5/must-gather-786t8" podUID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" containerName="copy" containerID="cri-o://1bb7fe194150d2d673f3902769beef3264893fd96c3a6793d754d049e5cb1813" gracePeriod=2 Jan 29 04:34:25 crc kubenswrapper[4707]: I0129 04:34:25.794306 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4k4k5/must-gather-786t8"] Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.453921 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4k4k5_must-gather-786t8_dfd22e26-c073-47dd-b0b4-4d58d6f93522/copy/0.log" Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.454085 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4k4k5_must-gather-786t8_dfd22e26-c073-47dd-b0b4-4d58d6f93522/copy/0.log" Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.454806 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.454814 4707 generic.go:334] "Generic (PLEG): container finished" podID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" containerID="1bb7fe194150d2d673f3902769beef3264893fd96c3a6793d754d049e5cb1813" exitCode=143 Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.454867 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="660f5e809f6c9f3396452f267f11c769c86b6e49efe35de20fc49e0f486851ff" Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.615892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ptr5\" (UniqueName: \"kubernetes.io/projected/dfd22e26-c073-47dd-b0b4-4d58d6f93522-kube-api-access-7ptr5\") pod \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\" (UID: \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\") " Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.616262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfd22e26-c073-47dd-b0b4-4d58d6f93522-must-gather-output\") pod \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\" (UID: \"dfd22e26-c073-47dd-b0b4-4d58d6f93522\") " Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.623229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd22e26-c073-47dd-b0b4-4d58d6f93522-kube-api-access-7ptr5" (OuterVolumeSpecName: "kube-api-access-7ptr5") pod "dfd22e26-c073-47dd-b0b4-4d58d6f93522" (UID: "dfd22e26-c073-47dd-b0b4-4d58d6f93522"). InnerVolumeSpecName "kube-api-access-7ptr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.720700 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ptr5\" (UniqueName: \"kubernetes.io/projected/dfd22e26-c073-47dd-b0b4-4d58d6f93522-kube-api-access-7ptr5\") on node \"crc\" DevicePath \"\"" Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.786007 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd22e26-c073-47dd-b0b4-4d58d6f93522-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dfd22e26-c073-47dd-b0b4-4d58d6f93522" (UID: "dfd22e26-c073-47dd-b0b4-4d58d6f93522"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:34:26 crc kubenswrapper[4707]: I0129 04:34:26.822674 4707 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfd22e26-c073-47dd-b0b4-4d58d6f93522-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 29 04:34:27 crc kubenswrapper[4707]: I0129 04:34:27.256492 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" path="/var/lib/kubelet/pods/dfd22e26-c073-47dd-b0b4-4d58d6f93522/volumes" Jan 29 04:34:27 crc kubenswrapper[4707]: I0129 04:34:27.467112 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4k5/must-gather-786t8" Jan 29 04:34:30 crc kubenswrapper[4707]: E0129 04:34:30.382836 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice/crio-660f5e809f6c9f3396452f267f11c769c86b6e49efe35de20fc49e0f486851ff\": RecentStats: unable to find data in memory cache]" Jan 29 04:34:40 crc kubenswrapper[4707]: E0129 04:34:40.642956 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice/crio-660f5e809f6c9f3396452f267f11c769c86b6e49efe35de20fc49e0f486851ff\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice\": RecentStats: unable to find data in memory cache]" Jan 29 04:34:41 crc kubenswrapper[4707]: I0129 04:34:41.576042 4707 scope.go:117] "RemoveContainer" containerID="1bb7fe194150d2d673f3902769beef3264893fd96c3a6793d754d049e5cb1813" Jan 29 04:34:41 crc kubenswrapper[4707]: I0129 04:34:41.606946 4707 scope.go:117] "RemoveContainer" containerID="8cf0295a37fd4f1befa905f6290641286c9c1527dc499c077bce0277a0a837ba" Jan 29 04:34:41 crc kubenswrapper[4707]: I0129 04:34:41.647563 4707 scope.go:117] "RemoveContainer" containerID="f1394075470c15b524dac2a37cdebd724b5aebc63ce6136c819249b730e8e058" Jan 29 04:34:41 crc kubenswrapper[4707]: I0129 04:34:41.750959 4707 scope.go:117] "RemoveContainer" containerID="d7a338f1c2ec18a7052b453e14851da48c1f9e3f59c0e7551a22d3b0ac43343d" Jan 29 04:34:50 crc kubenswrapper[4707]: E0129 
04:34:50.887470 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice/crio-660f5e809f6c9f3396452f267f11c769c86b6e49efe35de20fc49e0f486851ff\": RecentStats: unable to find data in memory cache]" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.746500 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xb9bn"] Jan 29 04:34:59 crc kubenswrapper[4707]: E0129 04:34:59.747616 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" containerName="copy" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.747634 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" containerName="copy" Jan 29 04:34:59 crc kubenswrapper[4707]: E0129 04:34:59.747654 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerName="extract-utilities" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.747663 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerName="extract-utilities" Jan 29 04:34:59 crc kubenswrapper[4707]: E0129 04:34:59.747686 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerName="extract-content" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.747692 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerName="extract-content" Jan 29 04:34:59 crc kubenswrapper[4707]: E0129 04:34:59.747706 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" containerName="gather" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.747712 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" containerName="gather" Jan 29 04:34:59 crc kubenswrapper[4707]: E0129 04:34:59.747729 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerName="registry-server" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.747735 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerName="registry-server" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.747917 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" containerName="copy" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.747934 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c27af4-69fa-4741-8a2d-06b15aa288b5" containerName="registry-server" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.747956 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd22e26-c073-47dd-b0b4-4d58d6f93522" containerName="gather" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.750038 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.757589 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-catalog-content\") pod \"certified-operators-xb9bn\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.757685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/cfffc755-f95a-4e37-9714-adc7666571c5-kube-api-access-4nf2t\") pod \"certified-operators-xb9bn\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.757723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-utilities\") pod \"certified-operators-xb9bn\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.759909 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xb9bn"] Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.858716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-catalog-content\") pod \"certified-operators-xb9bn\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.858793 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/cfffc755-f95a-4e37-9714-adc7666571c5-kube-api-access-4nf2t\") pod \"certified-operators-xb9bn\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.858827 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-utilities\") pod \"certified-operators-xb9bn\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.859261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-utilities\") pod \"certified-operators-xb9bn\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.859476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-catalog-content\") pod \"certified-operators-xb9bn\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:34:59 crc kubenswrapper[4707]: I0129 04:34:59.881231 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/cfffc755-f95a-4e37-9714-adc7666571c5-kube-api-access-4nf2t\") pod \"certified-operators-xb9bn\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:35:00 crc kubenswrapper[4707]: I0129 04:35:00.118559 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:35:00 crc kubenswrapper[4707]: I0129 04:35:00.644255 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xb9bn"] Jan 29 04:35:00 crc kubenswrapper[4707]: W0129 04:35:00.655481 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfffc755_f95a_4e37_9714_adc7666571c5.slice/crio-6aa3898271cdf86b0a0d4662f91d843b0ff40afd04d9b3cd0b0cfbb76184828d WatchSource:0}: Error finding container 6aa3898271cdf86b0a0d4662f91d843b0ff40afd04d9b3cd0b0cfbb76184828d: Status 404 returned error can't find the container with id 6aa3898271cdf86b0a0d4662f91d843b0ff40afd04d9b3cd0b0cfbb76184828d Jan 29 04:35:00 crc kubenswrapper[4707]: I0129 04:35:00.986072 4707 generic.go:334] "Generic (PLEG): container finished" podID="cfffc755-f95a-4e37-9714-adc7666571c5" containerID="7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3" exitCode=0 Jan 29 04:35:00 crc kubenswrapper[4707]: I0129 04:35:00.986205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb9bn" event={"ID":"cfffc755-f95a-4e37-9714-adc7666571c5","Type":"ContainerDied","Data":"7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3"} Jan 29 04:35:00 crc kubenswrapper[4707]: I0129 04:35:00.986592 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb9bn" event={"ID":"cfffc755-f95a-4e37-9714-adc7666571c5","Type":"ContainerStarted","Data":"6aa3898271cdf86b0a0d4662f91d843b0ff40afd04d9b3cd0b0cfbb76184828d"} Jan 29 04:35:01 crc kubenswrapper[4707]: E0129 04:35:01.170212 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice\": RecentStats: unable 
to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice/crio-660f5e809f6c9f3396452f267f11c769c86b6e49efe35de20fc49e0f486851ff\": RecentStats: unable to find data in memory cache]" Jan 29 04:35:02 crc kubenswrapper[4707]: I0129 04:35:02.001968 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb9bn" event={"ID":"cfffc755-f95a-4e37-9714-adc7666571c5","Type":"ContainerStarted","Data":"eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b"} Jan 29 04:35:03 crc kubenswrapper[4707]: I0129 04:35:03.462817 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:35:03 crc kubenswrapper[4707]: I0129 04:35:03.463186 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:35:04 crc kubenswrapper[4707]: I0129 04:35:04.026403 4707 generic.go:334] "Generic (PLEG): container finished" podID="cfffc755-f95a-4e37-9714-adc7666571c5" containerID="eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b" exitCode=0 Jan 29 04:35:04 crc kubenswrapper[4707]: I0129 04:35:04.026463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb9bn" event={"ID":"cfffc755-f95a-4e37-9714-adc7666571c5","Type":"ContainerDied","Data":"eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b"} Jan 29 04:35:05 crc kubenswrapper[4707]: I0129 04:35:05.037709 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb9bn" event={"ID":"cfffc755-f95a-4e37-9714-adc7666571c5","Type":"ContainerStarted","Data":"67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41"} Jan 29 04:35:05 crc kubenswrapper[4707]: I0129 04:35:05.057605 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xb9bn" podStartSLOduration=2.609438667 podStartE2EDuration="6.05758843s" podCreationTimestamp="2026-01-29 04:34:59 +0000 UTC" firstStartedPulling="2026-01-29 04:35:00.98817574 +0000 UTC m=+4054.472404685" lastFinishedPulling="2026-01-29 04:35:04.436325543 +0000 UTC m=+4057.920554448" observedRunningTime="2026-01-29 04:35:05.054552644 +0000 UTC m=+4058.538781549" watchObservedRunningTime="2026-01-29 04:35:05.05758843 +0000 UTC m=+4058.541817335" Jan 29 04:35:10 crc kubenswrapper[4707]: I0129 04:35:10.119155 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:35:10 crc kubenswrapper[4707]: I0129 04:35:10.119756 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:35:10 crc kubenswrapper[4707]: I0129 04:35:10.168037 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:35:11 crc kubenswrapper[4707]: I0129 04:35:11.183338 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:35:11 crc kubenswrapper[4707]: I0129 04:35:11.242438 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xb9bn"] Jan 29 04:35:11 crc kubenswrapper[4707]: E0129 04:35:11.448950 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice/crio-660f5e809f6c9f3396452f267f11c769c86b6e49efe35de20fc49e0f486851ff\": RecentStats: unable to find data in memory cache]" Jan 29 04:35:13 crc kubenswrapper[4707]: I0129 04:35:13.124127 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xb9bn" podUID="cfffc755-f95a-4e37-9714-adc7666571c5" containerName="registry-server" containerID="cri-o://67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41" gracePeriod=2 Jan 29 04:35:13 crc kubenswrapper[4707]: I0129 04:35:13.729901 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:35:13 crc kubenswrapper[4707]: I0129 04:35:13.928767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-catalog-content\") pod \"cfffc755-f95a-4e37-9714-adc7666571c5\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " Jan 29 04:35:13 crc kubenswrapper[4707]: I0129 04:35:13.929108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/cfffc755-f95a-4e37-9714-adc7666571c5-kube-api-access-4nf2t\") pod \"cfffc755-f95a-4e37-9714-adc7666571c5\" (UID: \"cfffc755-f95a-4e37-9714-adc7666571c5\") " Jan 29 04:35:13 crc kubenswrapper[4707]: I0129 04:35:13.929199 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-utilities\") pod \"cfffc755-f95a-4e37-9714-adc7666571c5\" (UID: 
\"cfffc755-f95a-4e37-9714-adc7666571c5\") " Jan 29 04:35:13 crc kubenswrapper[4707]: I0129 04:35:13.930347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-utilities" (OuterVolumeSpecName: "utilities") pod "cfffc755-f95a-4e37-9714-adc7666571c5" (UID: "cfffc755-f95a-4e37-9714-adc7666571c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:35:13 crc kubenswrapper[4707]: I0129 04:35:13.938961 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfffc755-f95a-4e37-9714-adc7666571c5-kube-api-access-4nf2t" (OuterVolumeSpecName: "kube-api-access-4nf2t") pod "cfffc755-f95a-4e37-9714-adc7666571c5" (UID: "cfffc755-f95a-4e37-9714-adc7666571c5"). InnerVolumeSpecName "kube-api-access-4nf2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:35:13 crc kubenswrapper[4707]: I0129 04:35:13.979866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfffc755-f95a-4e37-9714-adc7666571c5" (UID: "cfffc755-f95a-4e37-9714-adc7666571c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.032404 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nf2t\" (UniqueName: \"kubernetes.io/projected/cfffc755-f95a-4e37-9714-adc7666571c5-kube-api-access-4nf2t\") on node \"crc\" DevicePath \"\"" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.032440 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.032454 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfffc755-f95a-4e37-9714-adc7666571c5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.136829 4707 generic.go:334] "Generic (PLEG): container finished" podID="cfffc755-f95a-4e37-9714-adc7666571c5" containerID="67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41" exitCode=0 Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.136880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb9bn" event={"ID":"cfffc755-f95a-4e37-9714-adc7666571c5","Type":"ContainerDied","Data":"67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41"} Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.136917 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xb9bn" event={"ID":"cfffc755-f95a-4e37-9714-adc7666571c5","Type":"ContainerDied","Data":"6aa3898271cdf86b0a0d4662f91d843b0ff40afd04d9b3cd0b0cfbb76184828d"} Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.136932 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xb9bn" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.136960 4707 scope.go:117] "RemoveContainer" containerID="67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.163329 4707 scope.go:117] "RemoveContainer" containerID="eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.183649 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xb9bn"] Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.194889 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xb9bn"] Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.199157 4707 scope.go:117] "RemoveContainer" containerID="7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.245872 4707 scope.go:117] "RemoveContainer" containerID="67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41" Jan 29 04:35:14 crc kubenswrapper[4707]: E0129 04:35:14.246864 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41\": container with ID starting with 67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41 not found: ID does not exist" containerID="67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.246928 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41"} err="failed to get container status \"67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41\": rpc error: code = NotFound desc = could not find 
container \"67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41\": container with ID starting with 67c7794cf312ad3878b9a423d2df8bce9f1d4647fb06e28655423e4cea720e41 not found: ID does not exist" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.246960 4707 scope.go:117] "RemoveContainer" containerID="eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b" Jan 29 04:35:14 crc kubenswrapper[4707]: E0129 04:35:14.247613 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b\": container with ID starting with eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b not found: ID does not exist" containerID="eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.247666 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b"} err="failed to get container status \"eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b\": rpc error: code = NotFound desc = could not find container \"eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b\": container with ID starting with eaf87b2653e04a8168af2d906cc94ff6d3d0fe90487a50fb72c92c4acb1dff4b not found: ID does not exist" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.247701 4707 scope.go:117] "RemoveContainer" containerID="7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3" Jan 29 04:35:14 crc kubenswrapper[4707]: E0129 04:35:14.248213 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3\": container with ID starting with 7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3 not found: ID does 
not exist" containerID="7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3" Jan 29 04:35:14 crc kubenswrapper[4707]: I0129 04:35:14.248240 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3"} err="failed to get container status \"7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3\": rpc error: code = NotFound desc = could not find container \"7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3\": container with ID starting with 7dc1950f532e24049ad2ebbc360bdbeb6d09ef7ded3d62754f70418fad6c80c3 not found: ID does not exist" Jan 29 04:35:15 crc kubenswrapper[4707]: I0129 04:35:15.255798 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfffc755-f95a-4e37-9714-adc7666571c5" path="/var/lib/kubelet/pods/cfffc755-f95a-4e37-9714-adc7666571c5/volumes" Jan 29 04:35:21 crc kubenswrapper[4707]: E0129 04:35:21.720128 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd22e26_c073_47dd_b0b4_4d58d6f93522.slice/crio-660f5e809f6c9f3396452f267f11c769c86b6e49efe35de20fc49e0f486851ff\": RecentStats: unable to find data in memory cache]" Jan 29 04:35:27 crc kubenswrapper[4707]: E0129 04:35:27.282362 4707 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/971c16850c1c68d95a6d8b7170282624891c464ba96e4da23da57810a1a42f58/diff" to get inode usage: stat /var/lib/containers/storage/overlay/971c16850c1c68d95a6d8b7170282624891c464ba96e4da23da57810a1a42f58/diff: no such file or directory, extraDiskErr: Jan 29 04:35:33 crc kubenswrapper[4707]: I0129 
04:35:33.463460 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:35:33 crc kubenswrapper[4707]: I0129 04:35:33.464643 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:36:03 crc kubenswrapper[4707]: I0129 04:36:03.463310 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:36:03 crc kubenswrapper[4707]: I0129 04:36:03.464301 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:36:03 crc kubenswrapper[4707]: I0129 04:36:03.464378 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 04:36:03 crc kubenswrapper[4707]: I0129 04:36:03.465565 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51f9ca91f11952f12ffda104ffac66609fac364d355d6649b07810cf32060d88"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 04:36:03 crc kubenswrapper[4707]: I0129 04:36:03.465682 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://51f9ca91f11952f12ffda104ffac66609fac364d355d6649b07810cf32060d88" gracePeriod=600 Jan 29 04:36:03 crc kubenswrapper[4707]: I0129 04:36:03.686064 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="51f9ca91f11952f12ffda104ffac66609fac364d355d6649b07810cf32060d88" exitCode=0 Jan 29 04:36:03 crc kubenswrapper[4707]: I0129 04:36:03.686148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"51f9ca91f11952f12ffda104ffac66609fac364d355d6649b07810cf32060d88"} Jan 29 04:36:03 crc kubenswrapper[4707]: I0129 04:36:03.686455 4707 scope.go:117] "RemoveContainer" containerID="ca78598b3403b895bb1d359c88fd502b9d65283554ce50b6bdc83392bb19193b" Jan 29 04:36:04 crc kubenswrapper[4707]: I0129 04:36:04.697180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6"} Jan 29 04:37:21 crc kubenswrapper[4707]: I0129 04:37:21.947058 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qnn84/must-gather-8wm7f"] Jan 29 04:37:21 crc kubenswrapper[4707]: E0129 04:37:21.948155 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfffc755-f95a-4e37-9714-adc7666571c5" containerName="extract-content" Jan 29 04:37:21 crc 
kubenswrapper[4707]: I0129 04:37:21.948177 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfffc755-f95a-4e37-9714-adc7666571c5" containerName="extract-content" Jan 29 04:37:21 crc kubenswrapper[4707]: E0129 04:37:21.948207 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfffc755-f95a-4e37-9714-adc7666571c5" containerName="extract-utilities" Jan 29 04:37:21 crc kubenswrapper[4707]: I0129 04:37:21.948215 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfffc755-f95a-4e37-9714-adc7666571c5" containerName="extract-utilities" Jan 29 04:37:21 crc kubenswrapper[4707]: E0129 04:37:21.948258 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfffc755-f95a-4e37-9714-adc7666571c5" containerName="registry-server" Jan 29 04:37:21 crc kubenswrapper[4707]: I0129 04:37:21.948266 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfffc755-f95a-4e37-9714-adc7666571c5" containerName="registry-server" Jan 29 04:37:21 crc kubenswrapper[4707]: I0129 04:37:21.948493 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfffc755-f95a-4e37-9714-adc7666571c5" containerName="registry-server" Jan 29 04:37:21 crc kubenswrapper[4707]: I0129 04:37:21.949821 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:37:21 crc kubenswrapper[4707]: I0129 04:37:21.952498 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qnn84"/"openshift-service-ca.crt" Jan 29 04:37:21 crc kubenswrapper[4707]: I0129 04:37:21.952498 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qnn84"/"default-dockercfg-bzlbg" Jan 29 04:37:21 crc kubenswrapper[4707]: I0129 04:37:21.952639 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qnn84"/"kube-root-ca.crt" Jan 29 04:37:21 crc kubenswrapper[4707]: I0129 04:37:21.976865 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qnn84/must-gather-8wm7f"] Jan 29 04:37:22 crc kubenswrapper[4707]: I0129 04:37:22.053971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bf87d95-c954-47d1-a325-b0ab708edcdb-must-gather-output\") pod \"must-gather-8wm7f\" (UID: \"6bf87d95-c954-47d1-a325-b0ab708edcdb\") " pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:37:22 crc kubenswrapper[4707]: I0129 04:37:22.054088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf8nq\" (UniqueName: \"kubernetes.io/projected/6bf87d95-c954-47d1-a325-b0ab708edcdb-kube-api-access-qf8nq\") pod \"must-gather-8wm7f\" (UID: \"6bf87d95-c954-47d1-a325-b0ab708edcdb\") " pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:37:22 crc kubenswrapper[4707]: I0129 04:37:22.156654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bf87d95-c954-47d1-a325-b0ab708edcdb-must-gather-output\") pod \"must-gather-8wm7f\" (UID: \"6bf87d95-c954-47d1-a325-b0ab708edcdb\") " 
pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:37:22 crc kubenswrapper[4707]: I0129 04:37:22.156759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf8nq\" (UniqueName: \"kubernetes.io/projected/6bf87d95-c954-47d1-a325-b0ab708edcdb-kube-api-access-qf8nq\") pod \"must-gather-8wm7f\" (UID: \"6bf87d95-c954-47d1-a325-b0ab708edcdb\") " pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:37:22 crc kubenswrapper[4707]: I0129 04:37:22.157215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bf87d95-c954-47d1-a325-b0ab708edcdb-must-gather-output\") pod \"must-gather-8wm7f\" (UID: \"6bf87d95-c954-47d1-a325-b0ab708edcdb\") " pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:37:22 crc kubenswrapper[4707]: I0129 04:37:22.177099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf8nq\" (UniqueName: \"kubernetes.io/projected/6bf87d95-c954-47d1-a325-b0ab708edcdb-kube-api-access-qf8nq\") pod \"must-gather-8wm7f\" (UID: \"6bf87d95-c954-47d1-a325-b0ab708edcdb\") " pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:37:22 crc kubenswrapper[4707]: I0129 04:37:22.277804 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:37:23 crc kubenswrapper[4707]: I0129 04:37:23.304561 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qnn84/must-gather-8wm7f"] Jan 29 04:37:23 crc kubenswrapper[4707]: I0129 04:37:23.506268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnn84/must-gather-8wm7f" event={"ID":"6bf87d95-c954-47d1-a325-b0ab708edcdb","Type":"ContainerStarted","Data":"872581ded827f822fea70b1a70739a194131f3fa0a817ab29f654f132d87d937"} Jan 29 04:37:24 crc kubenswrapper[4707]: I0129 04:37:24.516059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnn84/must-gather-8wm7f" event={"ID":"6bf87d95-c954-47d1-a325-b0ab708edcdb","Type":"ContainerStarted","Data":"cdc0823e923de4735f44842004cebdbce192d0c94b646cf4743249b3dfe06a7e"} Jan 29 04:37:24 crc kubenswrapper[4707]: I0129 04:37:24.516388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnn84/must-gather-8wm7f" event={"ID":"6bf87d95-c954-47d1-a325-b0ab708edcdb","Type":"ContainerStarted","Data":"736c2cf5d70a2c41c8e7afb743adbd2a1c0aac1e9d91f51cd8f2ae911856fec0"} Jan 29 04:37:24 crc kubenswrapper[4707]: I0129 04:37:24.536312 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qnn84/must-gather-8wm7f" podStartSLOduration=3.536292972 podStartE2EDuration="3.536292972s" podCreationTimestamp="2026-01-29 04:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:37:24.52909319 +0000 UTC m=+4198.013322095" watchObservedRunningTime="2026-01-29 04:37:24.536292972 +0000 UTC m=+4198.020521877" Jan 29 04:37:26 crc kubenswrapper[4707]: I0129 04:37:26.945266 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qnn84/crc-debug-clqrd"] Jan 29 04:37:26 crc kubenswrapper[4707]: 
I0129 04:37:26.947554 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.073438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mx4\" (UniqueName: \"kubernetes.io/projected/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-kube-api-access-59mx4\") pod \"crc-debug-clqrd\" (UID: \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\") " pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.073796 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-host\") pod \"crc-debug-clqrd\" (UID: \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\") " pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.175904 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59mx4\" (UniqueName: \"kubernetes.io/projected/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-kube-api-access-59mx4\") pod \"crc-debug-clqrd\" (UID: \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\") " pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.175966 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-host\") pod \"crc-debug-clqrd\" (UID: \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\") " pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.176076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-host\") pod \"crc-debug-clqrd\" (UID: \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\") 
" pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.196110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mx4\" (UniqueName: \"kubernetes.io/projected/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-kube-api-access-59mx4\") pod \"crc-debug-clqrd\" (UID: \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\") " pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.268377 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.546913 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnn84/crc-debug-clqrd" event={"ID":"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4","Type":"ContainerStarted","Data":"63b566c3f8c3085ad3c860905a4c694e1905e578b6c2abc5e404e827f109b4a7"} Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.547479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnn84/crc-debug-clqrd" event={"ID":"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4","Type":"ContainerStarted","Data":"764303cfa05071e2b17e289d93ac95e21832f2b026f143cc19e8958be3ef0b0b"} Jan 29 04:37:27 crc kubenswrapper[4707]: I0129 04:37:27.568199 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qnn84/crc-debug-clqrd" podStartSLOduration=1.568178874 podStartE2EDuration="1.568178874s" podCreationTimestamp="2026-01-29 04:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 04:37:27.559688486 +0000 UTC m=+4201.043917401" watchObservedRunningTime="2026-01-29 04:37:27.568178874 +0000 UTC m=+4201.052407779" Jan 29 04:37:39 crc kubenswrapper[4707]: I0129 04:37:39.680909 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="634a7b5a-f37a-4c1b-b5fd-32aea5004bf4" containerID="63b566c3f8c3085ad3c860905a4c694e1905e578b6c2abc5e404e827f109b4a7" exitCode=0 Jan 29 04:37:39 crc kubenswrapper[4707]: I0129 04:37:39.681012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnn84/crc-debug-clqrd" event={"ID":"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4","Type":"ContainerDied","Data":"63b566c3f8c3085ad3c860905a4c694e1905e578b6c2abc5e404e827f109b4a7"} Jan 29 04:37:40 crc kubenswrapper[4707]: I0129 04:37:40.838321 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:40 crc kubenswrapper[4707]: I0129 04:37:40.863116 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59mx4\" (UniqueName: \"kubernetes.io/projected/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-kube-api-access-59mx4\") pod \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\" (UID: \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\") " Jan 29 04:37:40 crc kubenswrapper[4707]: I0129 04:37:40.863394 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-host\") pod \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\" (UID: \"634a7b5a-f37a-4c1b-b5fd-32aea5004bf4\") " Jan 29 04:37:40 crc kubenswrapper[4707]: I0129 04:37:40.864127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-host" (OuterVolumeSpecName: "host") pod "634a7b5a-f37a-4c1b-b5fd-32aea5004bf4" (UID: "634a7b5a-f37a-4c1b-b5fd-32aea5004bf4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 04:37:40 crc kubenswrapper[4707]: I0129 04:37:40.881003 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-kube-api-access-59mx4" (OuterVolumeSpecName: "kube-api-access-59mx4") pod "634a7b5a-f37a-4c1b-b5fd-32aea5004bf4" (UID: "634a7b5a-f37a-4c1b-b5fd-32aea5004bf4"). InnerVolumeSpecName "kube-api-access-59mx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:37:40 crc kubenswrapper[4707]: I0129 04:37:40.888308 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qnn84/crc-debug-clqrd"] Jan 29 04:37:40 crc kubenswrapper[4707]: I0129 04:37:40.902288 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qnn84/crc-debug-clqrd"] Jan 29 04:37:40 crc kubenswrapper[4707]: I0129 04:37:40.965638 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59mx4\" (UniqueName: \"kubernetes.io/projected/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-kube-api-access-59mx4\") on node \"crc\" DevicePath \"\"" Jan 29 04:37:40 crc kubenswrapper[4707]: I0129 04:37:40.965687 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4-host\") on node \"crc\" DevicePath \"\"" Jan 29 04:37:41 crc kubenswrapper[4707]: I0129 04:37:41.256894 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634a7b5a-f37a-4c1b-b5fd-32aea5004bf4" path="/var/lib/kubelet/pods/634a7b5a-f37a-4c1b-b5fd-32aea5004bf4/volumes" Jan 29 04:37:41 crc kubenswrapper[4707]: I0129 04:37:41.699310 4707 scope.go:117] "RemoveContainer" containerID="63b566c3f8c3085ad3c860905a4c694e1905e578b6c2abc5e404e827f109b4a7" Jan 29 04:37:41 crc kubenswrapper[4707]: I0129 04:37:41.699355 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnn84/crc-debug-clqrd" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.090882 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qnn84/crc-debug-gnz54"] Jan 29 04:37:42 crc kubenswrapper[4707]: E0129 04:37:42.091455 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634a7b5a-f37a-4c1b-b5fd-32aea5004bf4" containerName="container-00" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.091470 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="634a7b5a-f37a-4c1b-b5fd-32aea5004bf4" containerName="container-00" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.091770 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="634a7b5a-f37a-4c1b-b5fd-32aea5004bf4" containerName="container-00" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.092523 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.290479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kvn6\" (UniqueName: \"kubernetes.io/projected/f5b820f4-a051-4ee1-9aab-9ed782eedd01-kube-api-access-9kvn6\") pod \"crc-debug-gnz54\" (UID: \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\") " pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.290760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5b820f4-a051-4ee1-9aab-9ed782eedd01-host\") pod \"crc-debug-gnz54\" (UID: \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\") " pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.392555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f5b820f4-a051-4ee1-9aab-9ed782eedd01-host\") pod \"crc-debug-gnz54\" (UID: \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\") " pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.392625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kvn6\" (UniqueName: \"kubernetes.io/projected/f5b820f4-a051-4ee1-9aab-9ed782eedd01-kube-api-access-9kvn6\") pod \"crc-debug-gnz54\" (UID: \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\") " pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.392808 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5b820f4-a051-4ee1-9aab-9ed782eedd01-host\") pod \"crc-debug-gnz54\" (UID: \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\") " pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.410994 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kvn6\" (UniqueName: \"kubernetes.io/projected/f5b820f4-a051-4ee1-9aab-9ed782eedd01-kube-api-access-9kvn6\") pod \"crc-debug-gnz54\" (UID: \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\") " pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:42 crc kubenswrapper[4707]: I0129 04:37:42.712579 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:43 crc kubenswrapper[4707]: I0129 04:37:43.728419 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5b820f4-a051-4ee1-9aab-9ed782eedd01" containerID="a11fbaea0b3d71951090ee573c9ba632b905120fa20becaa59c82829d62991da" exitCode=1 Jan 29 04:37:43 crc kubenswrapper[4707]: I0129 04:37:43.728527 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnn84/crc-debug-gnz54" event={"ID":"f5b820f4-a051-4ee1-9aab-9ed782eedd01","Type":"ContainerDied","Data":"a11fbaea0b3d71951090ee573c9ba632b905120fa20becaa59c82829d62991da"} Jan 29 04:37:43 crc kubenswrapper[4707]: I0129 04:37:43.728689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qnn84/crc-debug-gnz54" event={"ID":"f5b820f4-a051-4ee1-9aab-9ed782eedd01","Type":"ContainerStarted","Data":"104b8e9e5890afa3f42d34824197406a69e8bdaa5dd259ec20d4afd451496b9d"} Jan 29 04:37:43 crc kubenswrapper[4707]: I0129 04:37:43.794097 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qnn84/crc-debug-gnz54"] Jan 29 04:37:43 crc kubenswrapper[4707]: I0129 04:37:43.816110 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qnn84/crc-debug-gnz54"] Jan 29 04:37:45 crc kubenswrapper[4707]: I0129 04:37:45.219984 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:45 crc kubenswrapper[4707]: I0129 04:37:45.352640 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5b820f4-a051-4ee1-9aab-9ed782eedd01-host\") pod \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\" (UID: \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\") " Jan 29 04:37:45 crc kubenswrapper[4707]: I0129 04:37:45.352761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5b820f4-a051-4ee1-9aab-9ed782eedd01-host" (OuterVolumeSpecName: "host") pod "f5b820f4-a051-4ee1-9aab-9ed782eedd01" (UID: "f5b820f4-a051-4ee1-9aab-9ed782eedd01"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 04:37:45 crc kubenswrapper[4707]: I0129 04:37:45.352949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kvn6\" (UniqueName: \"kubernetes.io/projected/f5b820f4-a051-4ee1-9aab-9ed782eedd01-kube-api-access-9kvn6\") pod \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\" (UID: \"f5b820f4-a051-4ee1-9aab-9ed782eedd01\") " Jan 29 04:37:45 crc kubenswrapper[4707]: I0129 04:37:45.353608 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5b820f4-a051-4ee1-9aab-9ed782eedd01-host\") on node \"crc\" DevicePath \"\"" Jan 29 04:37:45 crc kubenswrapper[4707]: I0129 04:37:45.359619 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b820f4-a051-4ee1-9aab-9ed782eedd01-kube-api-access-9kvn6" (OuterVolumeSpecName: "kube-api-access-9kvn6") pod "f5b820f4-a051-4ee1-9aab-9ed782eedd01" (UID: "f5b820f4-a051-4ee1-9aab-9ed782eedd01"). InnerVolumeSpecName "kube-api-access-9kvn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:37:45 crc kubenswrapper[4707]: I0129 04:37:45.455394 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kvn6\" (UniqueName: \"kubernetes.io/projected/f5b820f4-a051-4ee1-9aab-9ed782eedd01-kube-api-access-9kvn6\") on node \"crc\" DevicePath \"\"" Jan 29 04:37:45 crc kubenswrapper[4707]: I0129 04:37:45.745777 4707 scope.go:117] "RemoveContainer" containerID="a11fbaea0b3d71951090ee573c9ba632b905120fa20becaa59c82829d62991da" Jan 29 04:37:45 crc kubenswrapper[4707]: I0129 04:37:45.745943 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnn84/crc-debug-gnz54" Jan 29 04:37:47 crc kubenswrapper[4707]: I0129 04:37:47.254827 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5b820f4-a051-4ee1-9aab-9ed782eedd01" path="/var/lib/kubelet/pods/f5b820f4-a051-4ee1-9aab-9ed782eedd01/volumes" Jan 29 04:38:03 crc kubenswrapper[4707]: I0129 04:38:03.464134 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:38:03 crc kubenswrapper[4707]: I0129 04:38:03.464685 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:38:33 crc kubenswrapper[4707]: I0129 04:38:33.463663 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:38:33 crc kubenswrapper[4707]: I0129 04:38:33.464255 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:38:55 crc kubenswrapper[4707]: I0129 04:38:55.370051 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_21d0ba1c-ab07-48da-8e34-93da9d1c9c6a/init-config-reloader/0.log" Jan 29 04:38:55 crc kubenswrapper[4707]: I0129 04:38:55.600983 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_21d0ba1c-ab07-48da-8e34-93da9d1c9c6a/init-config-reloader/0.log" Jan 29 04:38:55 crc kubenswrapper[4707]: I0129 04:38:55.642150 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_21d0ba1c-ab07-48da-8e34-93da9d1c9c6a/config-reloader/0.log" Jan 29 04:38:55 crc kubenswrapper[4707]: I0129 04:38:55.668872 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_21d0ba1c-ab07-48da-8e34-93da9d1c9c6a/alertmanager/0.log" Jan 29 04:38:55 crc kubenswrapper[4707]: I0129 04:38:55.819951 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_1c46106d-ca5d-4cac-820f-cf2935abf6d8/aodh-api/0.log" Jan 29 04:38:55 crc kubenswrapper[4707]: I0129 04:38:55.904664 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_1c46106d-ca5d-4cac-820f-cf2935abf6d8/aodh-evaluator/0.log" Jan 29 04:38:55 crc kubenswrapper[4707]: I0129 04:38:55.915914 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_1c46106d-ca5d-4cac-820f-cf2935abf6d8/aodh-listener/0.log" Jan 29 04:38:56 crc 
kubenswrapper[4707]: I0129 04:38:56.045042 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_1c46106d-ca5d-4cac-820f-cf2935abf6d8/aodh-notifier/0.log" Jan 29 04:38:56 crc kubenswrapper[4707]: I0129 04:38:56.721698 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7665bc55c6-vnwk8_374bee16-aeed-4b53-845a-494375d065f6/barbican-keystone-listener/0.log" Jan 29 04:38:56 crc kubenswrapper[4707]: I0129 04:38:56.722258 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6cf5fb45fd-lqs99_27a75d55-3866-4c55-bffb-8f1f1d53b687/barbican-api/0.log" Jan 29 04:38:56 crc kubenswrapper[4707]: I0129 04:38:56.741441 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6cf5fb45fd-lqs99_27a75d55-3866-4c55-bffb-8f1f1d53b687/barbican-api-log/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.128509 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7665bc55c6-vnwk8_374bee16-aeed-4b53-845a-494375d065f6/barbican-keystone-listener-log/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.159224 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-78bfcf785f-txfhz_cd6d292d-51ed-4b96-89e1-06220cd5f98b/barbican-worker/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.301323 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-78bfcf785f-txfhz_cd6d292d-51ed-4b96-89e1-06220cd5f98b/barbican-worker-log/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.437016 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9x2sz_4ed3ca47-cf57-4534-b12c-2aa6c2be26cd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.607816 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_7c20a992-d535-46ad-9cc4-f2348c18f7ca/ceilometer-central-agent/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.678531 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7c20a992-d535-46ad-9cc4-f2348c18f7ca/ceilometer-notification-agent/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.742127 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7c20a992-d535-46ad-9cc4-f2348c18f7ca/proxy-httpd/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.790021 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7c20a992-d535-46ad-9cc4-f2348c18f7ca/sg-core/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.971762 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b270d5b9-7b1a-44b2-b915-4f63e06a10eb/cinder-api/0.log" Jan 29 04:38:57 crc kubenswrapper[4707]: I0129 04:38:57.994647 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b270d5b9-7b1a-44b2-b915-4f63e06a10eb/cinder-api-log/0.log" Jan 29 04:38:58 crc kubenswrapper[4707]: I0129 04:38:58.198936 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_539d5b33-91ee-4790-941f-22c82388ed87/probe/0.log" Jan 29 04:38:58 crc kubenswrapper[4707]: I0129 04:38:58.201854 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_539d5b33-91ee-4790-941f-22c82388ed87/cinder-scheduler/0.log" Jan 29 04:38:58 crc kubenswrapper[4707]: I0129 04:38:58.294746 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7vkk7_98ca67a4-de19-4954-a988-1c743df160cd/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:38:58 crc kubenswrapper[4707]: I0129 04:38:58.420496 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-nhhr2_933d5dc9-d255-45c9-837d-251701e8fd77/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:38:58 crc kubenswrapper[4707]: I0129 04:38:58.553680 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-4wd7q_fdf17e92-84cc-4d06-ba4f-714cfd41c134/init/0.log" Jan 29 04:38:58 crc kubenswrapper[4707]: I0129 04:38:58.860776 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-4wd7q_fdf17e92-84cc-4d06-ba4f-714cfd41c134/dnsmasq-dns/0.log" Jan 29 04:38:58 crc kubenswrapper[4707]: I0129 04:38:58.862279 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-4wd7q_fdf17e92-84cc-4d06-ba4f-714cfd41c134/init/0.log" Jan 29 04:38:58 crc kubenswrapper[4707]: I0129 04:38:58.883909 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-x2dnt_43b1dffd-18d4-4201-9a3f-5ef4db33c8b7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:38:59 crc kubenswrapper[4707]: I0129 04:38:59.085251 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_46ce0794-979b-4f4c-9a41-b895bbc25d0c/glance-log/0.log" Jan 29 04:38:59 crc kubenswrapper[4707]: I0129 04:38:59.099235 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9a1690ce-5c45-4a23-abd5-a1521acd3f82/glance-httpd/0.log" Jan 29 04:38:59 crc kubenswrapper[4707]: I0129 04:38:59.134514 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_46ce0794-979b-4f4c-9a41-b895bbc25d0c/glance-httpd/0.log" Jan 29 04:38:59 crc kubenswrapper[4707]: I0129 04:38:59.179036 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_9a1690ce-5c45-4a23-abd5-a1521acd3f82/glance-log/0.log" Jan 29 04:38:59 crc kubenswrapper[4707]: I0129 04:38:59.612747 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-7b6695978c-nxvfw_52ddee41-4c3c-4b7b-b637-2de751496d37/heat-engine/0.log" Jan 29 04:38:59 crc kubenswrapper[4707]: I0129 04:38:59.760441 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-lcq4p_945bd58d-5ea2-4118-a675-3b7b127d9d4c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:38:59 crc kubenswrapper[4707]: I0129 04:38:59.791859 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-74d4777d5f-4mj7v_9b19f31f-481f-4feb-91bb-09df20de5654/heat-api/0.log" Jan 29 04:38:59 crc kubenswrapper[4707]: I0129 04:38:59.836718 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-78c45ff765-hw8sk_0f27421c-79af-4e0d-b97f-c1d73b2524e2/heat-cfnapi/0.log" Jan 29 04:39:00 crc kubenswrapper[4707]: I0129 04:39:00.002360 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-b4b42_1503434a-951a-4e31-836e-c1f37b794d45/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:00 crc kubenswrapper[4707]: I0129 04:39:00.120862 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6568fdcd45-j5nxz_0542ad30-5c42-4464-83b9-3faebd15a9ea/keystone-api/0.log" Jan 29 04:39:00 crc kubenswrapper[4707]: I0129 04:39:00.151711 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29494321-prr9g_b1afb9c0-b9e9-46d1-b608-36148c671d74/keystone-cron/0.log" Jan 29 04:39:00 crc kubenswrapper[4707]: I0129 04:39:00.270693 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_6f586474-1963-4702-81bf-36d31bf0a3ae/kube-state-metrics/0.log" Jan 29 04:39:00 crc kubenswrapper[4707]: I0129 04:39:00.608028 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-2rbjz_a019e4eb-4ee9-4426-bde0-9c6b0319283f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:00 crc kubenswrapper[4707]: I0129 04:39:00.822564 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f8ffb664f-gwtlc_4d553753-4701-4a28-81dd-f7d0fbe719d6/neutron-api/0.log" Jan 29 04:39:00 crc kubenswrapper[4707]: I0129 04:39:00.848971 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f8ffb664f-gwtlc_4d553753-4701-4a28-81dd-f7d0fbe719d6/neutron-httpd/0.log" Jan 29 04:39:01 crc kubenswrapper[4707]: I0129 04:39:01.038938 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-st77n_9f02add7-c3ef-4952-b83f-1799bf08bad0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:01 crc kubenswrapper[4707]: I0129 04:39:01.356036 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ce279671-2df3-4af7-a6bb-2ac9fdc048da/nova-api-log/0.log" Jan 29 04:39:01 crc kubenswrapper[4707]: I0129 04:39:01.491285 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_66aa645d-9edf-4791-a8f7-2607ad442104/nova-cell0-conductor-conductor/0.log" Jan 29 04:39:01 crc kubenswrapper[4707]: I0129 04:39:01.751680 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ce279671-2df3-4af7-a6bb-2ac9fdc048da/nova-api-api/0.log" Jan 29 04:39:01 crc kubenswrapper[4707]: I0129 04:39:01.766808 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_47acef2f-13c3-47cd-b61f-a65e20f570a4/nova-cell1-conductor-conductor/0.log" 
Jan 29 04:39:01 crc kubenswrapper[4707]: I0129 04:39:01.910495 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_99c8a2aa-31c2-4927-af04-8f5e8c50198e/nova-cell1-novncproxy-novncproxy/0.log" Jan 29 04:39:01 crc kubenswrapper[4707]: I0129 04:39:01.961501 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-2b5nk_018b06ef-5822-4b5e-ae32-43bf56e40f19/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:02 crc kubenswrapper[4707]: I0129 04:39:02.347177 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df23cf25-bfda-4999-85bf-ef4af0738ece/nova-metadata-log/0.log" Jan 29 04:39:02 crc kubenswrapper[4707]: I0129 04:39:02.538779 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_7164be40-3659-450f-885f-db200baa5ed2/nova-scheduler-scheduler/0.log" Jan 29 04:39:02 crc kubenswrapper[4707]: I0129 04:39:02.565124 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6aaab4a-5490-4d22-ac2a-e346a1371683/mysql-bootstrap/0.log" Jan 29 04:39:02 crc kubenswrapper[4707]: I0129 04:39:02.803600 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6aaab4a-5490-4d22-ac2a-e346a1371683/mysql-bootstrap/0.log" Jan 29 04:39:02 crc kubenswrapper[4707]: I0129 04:39:02.831714 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6aaab4a-5490-4d22-ac2a-e346a1371683/galera/0.log" Jan 29 04:39:03 crc kubenswrapper[4707]: I0129 04:39:03.462863 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 04:39:03 crc kubenswrapper[4707]: 
I0129 04:39:03.462912 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 04:39:03 crc kubenswrapper[4707]: I0129 04:39:03.462954 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" Jan 29 04:39:03 crc kubenswrapper[4707]: I0129 04:39:03.463733 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6"} pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 04:39:03 crc kubenswrapper[4707]: I0129 04:39:03.463787 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" containerID="cri-o://4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" gracePeriod=600 Jan 29 04:39:03 crc kubenswrapper[4707]: E0129 04:39:03.601821 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:39:03 crc kubenswrapper[4707]: I0129 04:39:03.652270 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_8dbb64e8-99fc-4b59-abdc-fce36a90b82f/mysql-bootstrap/0.log" Jan 29 04:39:03 crc kubenswrapper[4707]: I0129 04:39:03.743217 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df23cf25-bfda-4999-85bf-ef4af0738ece/nova-metadata-metadata/0.log" Jan 29 04:39:03 crc kubenswrapper[4707]: I0129 04:39:03.844027 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8dbb64e8-99fc-4b59-abdc-fce36a90b82f/mysql-bootstrap/0.log" Jan 29 04:39:03 crc kubenswrapper[4707]: I0129 04:39:03.875725 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8dbb64e8-99fc-4b59-abdc-fce36a90b82f/galera/0.log" Jan 29 04:39:03 crc kubenswrapper[4707]: I0129 04:39:03.974707 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2afcd46a-a1c0-41cf-866e-3a39e0ac9a36/openstackclient/0.log" Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 04:39:04.116185 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hpq5q_9f831116-140a-4c6b-8d7c-aad99fcaf97c/ovn-controller/0.log" Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 04:39:04.144735 4707 generic.go:334] "Generic (PLEG): container finished" podID="df12d101-b13d-4276-94b7-422c6609d2e8" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" exitCode=0 Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 04:39:04.144782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerDied","Data":"4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6"} Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 04:39:04.144824 4707 scope.go:117] "RemoveContainer" containerID="51f9ca91f11952f12ffda104ffac66609fac364d355d6649b07810cf32060d88" Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 
04:39:04.145681 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:39:04 crc kubenswrapper[4707]: E0129 04:39:04.146104 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 04:39:04.257108 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6rw89_3bc182e0-848e-43b3-8d1e-920440755bca/openstack-network-exporter/0.log" Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 04:39:04.398597 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hxz2d_f25cc401-b568-4936-9947-2a54b5f6dea9/ovsdb-server-init/0.log" Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 04:39:04.727843 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hxz2d_f25cc401-b568-4936-9947-2a54b5f6dea9/ovsdb-server/0.log" Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 04:39:04.806623 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hxz2d_f25cc401-b568-4936-9947-2a54b5f6dea9/ovs-vswitchd/0.log" Jan 29 04:39:04 crc kubenswrapper[4707]: I0129 04:39:04.831602 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hxz2d_f25cc401-b568-4936-9947-2a54b5f6dea9/ovsdb-server-init/0.log" Jan 29 04:39:05 crc kubenswrapper[4707]: I0129 04:39:05.527843 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bzfvj_80667caf-0ec4-4178-96b2-93b148db9c1e/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:05 crc kubenswrapper[4707]: I0129 04:39:05.609731 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_79b44fdd-6478-42a0-9817-b3d949683532/openstack-network-exporter/0.log" Jan 29 04:39:05 crc kubenswrapper[4707]: I0129 04:39:05.610012 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_79b44fdd-6478-42a0-9817-b3d949683532/ovn-northd/0.log" Jan 29 04:39:05 crc kubenswrapper[4707]: I0129 04:39:05.795140 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_445c0ce8-31bb-4f8a-a139-e1d7a63d38f7/openstack-network-exporter/0.log" Jan 29 04:39:05 crc kubenswrapper[4707]: I0129 04:39:05.843250 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_445c0ce8-31bb-4f8a-a139-e1d7a63d38f7/ovsdbserver-nb/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.000013 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b5dee206-46c6-44c4-885d-0d8ba9149bfd/openstack-network-exporter/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.047132 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b5dee206-46c6-44c4-885d-0d8ba9149bfd/ovsdbserver-sb/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.250280 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9f459ff7d-tkv2s_2ceecbc6-bf80-4008-80e3-0a43426cf4c6/placement-log/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.304129 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9f459ff7d-tkv2s_2ceecbc6-bf80-4008-80e3-0a43426cf4c6/placement-api/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.354188 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/init-config-reloader/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.583995 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/thanos-sidecar/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.592177 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/init-config-reloader/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.653300 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/config-reloader/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.655583 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_83c2f62d-5b16-40f5-bc31-1da853f155b9/prometheus/0.log" Jan 29 04:39:06 crc kubenswrapper[4707]: I0129 04:39:06.848787 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_688208b9-5567-4a47-9ec9-76ce03ec8991/setup-container/0.log" Jan 29 04:39:07 crc kubenswrapper[4707]: I0129 04:39:07.064731 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_688208b9-5567-4a47-9ec9-76ce03ec8991/rabbitmq/0.log" Jan 29 04:39:07 crc kubenswrapper[4707]: I0129 04:39:07.092883 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fb587ed3-9015-4748-a28b-10d4132ffdfb/setup-container/0.log" Jan 29 04:39:07 crc kubenswrapper[4707]: I0129 04:39:07.114663 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_688208b9-5567-4a47-9ec9-76ce03ec8991/setup-container/0.log" Jan 29 04:39:07 crc kubenswrapper[4707]: I0129 04:39:07.291735 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fb587ed3-9015-4748-a28b-10d4132ffdfb/setup-container/0.log" Jan 29 04:39:07 crc kubenswrapper[4707]: I0129 04:39:07.378221 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fb587ed3-9015-4748-a28b-10d4132ffdfb/rabbitmq/0.log" Jan 29 04:39:07 crc kubenswrapper[4707]: I0129 04:39:07.476633 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-f4lcm_c932c837-2020-4db4-8598-c4803eff8029/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:07 crc kubenswrapper[4707]: I0129 04:39:07.693072 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-5nrs9_7b03e702-6a8f-4bcc-8be0-4ec0eaf53900/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:07 crc kubenswrapper[4707]: I0129 04:39:07.736784 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6jr5s_3b1ed1fd-6748-40d9-b458-ca63cb4479e0/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:07 crc kubenswrapper[4707]: I0129 04:39:07.989875 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-ncft6_21dfe5ce-4935-46c4-8124-cb4fecf0a906/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:08 crc kubenswrapper[4707]: I0129 04:39:08.153501 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-r7f6d_c8e32f79-d8b0-424c-a23e-fe94623016de/ssh-known-hosts-edpm-deployment/0.log" Jan 29 04:39:08 crc kubenswrapper[4707]: I0129 04:39:08.223753 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67b9cbc75f-dv5cr_57f35f5f-1517-41b4-b354-59fd90d8fea5/proxy-server/0.log" Jan 29 04:39:08 crc kubenswrapper[4707]: I0129 04:39:08.350371 4707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67b9cbc75f-dv5cr_57f35f5f-1517-41b4-b354-59fd90d8fea5/proxy-httpd/0.log" Jan 29 04:39:08 crc kubenswrapper[4707]: I0129 04:39:08.359393 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-bxs5q_b9e5582d-dd71-4ccd-84ea-bc133dce917c/swift-ring-rebalance/0.log" Jan 29 04:39:08 crc kubenswrapper[4707]: I0129 04:39:08.826439 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/account-auditor/0.log" Jan 29 04:39:08 crc kubenswrapper[4707]: I0129 04:39:08.827776 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/account-replicator/0.log" Jan 29 04:39:08 crc kubenswrapper[4707]: I0129 04:39:08.832163 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/account-reaper/0.log" Jan 29 04:39:08 crc kubenswrapper[4707]: I0129 04:39:08.900350 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/account-server/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.038789 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/container-server/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.101252 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/container-auditor/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.125310 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/container-replicator/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.141859 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/container-updater/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.289231 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-auditor/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.347062 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-expirer/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.436855 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-replicator/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.451716 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-server/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.522152 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/object-updater/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.604065 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/rsync/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.666240 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_249edadf-1bb4-4d39-aae3-40384ba10bae/swift-recon-cron/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.871206 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-sqr8r_b8c300e6-01c5-493d-b263-2b6cdfaba0c9/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:09 crc kubenswrapper[4707]: I0129 04:39:09.934194 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-892tt_bf895266-d1e5-47d5-8d3a-397acefb3f9b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 29 04:39:15 crc kubenswrapper[4707]: I0129 04:39:15.244439 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:39:15 crc kubenswrapper[4707]: E0129 04:39:15.245849 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:39:17 crc kubenswrapper[4707]: I0129 04:39:17.889622 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_33afc350-9c09-4d5f-aa86-80ccc0b670ba/memcached/0.log" Jan 29 04:39:26 crc kubenswrapper[4707]: I0129 04:39:26.243374 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:39:26 crc kubenswrapper[4707]: E0129 04:39:26.244672 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:39:40 crc kubenswrapper[4707]: I0129 04:39:40.244571 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:39:40 crc kubenswrapper[4707]: E0129 04:39:40.245464 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:39:43 crc kubenswrapper[4707]: I0129 04:39:43.323355 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/util/0.log" Jan 29 04:39:43 crc kubenswrapper[4707]: I0129 04:39:43.485270 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/util/0.log" Jan 29 04:39:43 crc kubenswrapper[4707]: I0129 04:39:43.554176 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/pull/0.log" Jan 29 04:39:43 crc kubenswrapper[4707]: I0129 04:39:43.636162 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/pull/0.log" Jan 29 04:39:43 crc kubenswrapper[4707]: I0129 04:39:43.767852 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/util/0.log" Jan 29 04:39:43 crc kubenswrapper[4707]: I0129 04:39:43.801198 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/extract/0.log" Jan 29 
04:39:43 crc kubenswrapper[4707]: I0129 04:39:43.814745 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8275b17851ff8fc62fd6ffae3af38d3589b7d5e36b92778b1fd9f59364qts6v_882475a0-6529-4596-9104-2c7ec1c2e414/pull/0.log" Jan 29 04:39:44 crc kubenswrapper[4707]: I0129 04:39:44.014103 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6bc7f4f4cf-9xlqh_6dbe27ba-a451-4202-8f58-73cb0684bfea/manager/0.log" Jan 29 04:39:44 crc kubenswrapper[4707]: I0129 04:39:44.067484 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-f6487bd57-qbdg4_155a3715-4600-4f83-8db3-a6beaf5c3394/manager/0.log" Jan 29 04:39:44 crc kubenswrapper[4707]: I0129 04:39:44.223780 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66dfbd6f5d-jlsmf_f064b8fa-dd53-4fd8-8440-9e517b1c1279/manager/0.log" Jan 29 04:39:44 crc kubenswrapper[4707]: I0129 04:39:44.394786 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-6db5dbd896-c2sgx_3445268f-15c8-4438-8fd1-a13d2bd9981d/manager/0.log" Jan 29 04:39:44 crc kubenswrapper[4707]: I0129 04:39:44.556377 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-587c6bfdcf-c8v2v_de21d951-1d0b-415e-8923-5fa2cc58e439/manager/0.log" Jan 29 04:39:44 crc kubenswrapper[4707]: I0129 04:39:44.628971 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-z2vvp_59a9dc92-c9db-4bfa-8233-88b1690beaad/manager/0.log" Jan 29 04:39:44 crc kubenswrapper[4707]: I0129 04:39:44.878399 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-958664b5-qgj46_8255e85b-8815-4860-9325-7570ba9a6fd9/manager/0.log" Jan 29 04:39:44 crc kubenswrapper[4707]: I0129 04:39:44.990997 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-4wck9_09633ead-78c6-4934-95c2-05b24c6fc3e5/manager/0.log" Jan 29 04:39:45 crc kubenswrapper[4707]: I0129 04:39:45.161401 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6978b79747-49nhf_6fa97c9f-4b04-4795-9f11-9790c692ba0f/manager/0.log" Jan 29 04:39:45 crc kubenswrapper[4707]: I0129 04:39:45.218577 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-765668569f-jq2z4_02f283f2-5bf1-4ee7-ac34-751ffc96421c/manager/0.log" Jan 29 04:39:45 crc kubenswrapper[4707]: I0129 04:39:45.382366 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-jvdtd_55320b88-8f86-47bd-8718-6cabd0865a1c/manager/0.log" Jan 29 04:39:45 crc kubenswrapper[4707]: I0129 04:39:45.508947 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-694c5bfc85-g6dzc_7d2c1f08-0b63-4368-a7cc-9374d0dbf035/manager/0.log" Jan 29 04:39:45 crc kubenswrapper[4707]: I0129 04:39:45.727789 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-ddcbfd695-mrfw2_e15cd320-f902-4d99-8037-5c9355f4a833/manager/0.log" Jan 29 04:39:45 crc kubenswrapper[4707]: I0129 04:39:45.773365 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5c765b4558-5jtr8_26d2ace7-4405-480c-acf8-233e1511007f/manager/0.log" Jan 29 04:39:45 crc kubenswrapper[4707]: I0129 04:39:45.895216 4707 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4ddvspv_a6df0676-63de-4a83-bc60-9b69a2f8777f/manager/0.log" Jan 29 04:39:46 crc kubenswrapper[4707]: I0129 04:39:46.080079 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6955d4df64-rgqtz_76e29c5c-b257-48b7-953c-d7db3c6407ed/operator/0.log" Jan 29 04:39:46 crc kubenswrapper[4707]: I0129 04:39:46.297704 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-pnmmg_2553fa13-b0b4-45c7-9317-f6be21e7c1f0/registry-server/0.log" Jan 29 04:39:46 crc kubenswrapper[4707]: I0129 04:39:46.803986 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-4cp2t_e0372a1a-cd84-491e-a3d1-f58389a66b63/manager/0.log" Jan 29 04:39:46 crc kubenswrapper[4707]: I0129 04:39:46.968141 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-wblrt_f57d529d-1352-47d9-baa8-a2f383374b35/manager/0.log" Jan 29 04:39:47 crc kubenswrapper[4707]: I0129 04:39:47.214841 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-b6ggj_6fb86866-7c9d-4b4f-bf81-8a36898aca3d/operator/0.log" Jan 29 04:39:47 crc kubenswrapper[4707]: I0129 04:39:47.358771 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-zmvww_b7b5c12b-680b-4814-906c-62c9f8702559/manager/0.log" Jan 29 04:39:47 crc kubenswrapper[4707]: I0129 04:39:47.359582 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-cc96c49b6-x4zwn_d938abde-b4d6-4d4e-a176-9ed92ac5325d/manager/0.log" Jan 29 04:39:47 crc kubenswrapper[4707]: I0129 04:39:47.536814 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7886d5cc69-w8rzq_0a32b73c-f66f-425f-81a9-ef1cc36041d4/manager/0.log" Jan 29 04:39:47 crc kubenswrapper[4707]: I0129 04:39:47.650304 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-lnfzj_6e5e159a-c89d-43cf-b9cf-4a92de09ac22/manager/0.log" Jan 29 04:39:47 crc kubenswrapper[4707]: I0129 04:39:47.710472 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-767b8bc766-mhnm2_706ea7e5-d8b2-4bc1-900b-d62dddcad89e/manager/0.log" Jan 29 04:39:51 crc kubenswrapper[4707]: I0129 04:39:51.244013 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:39:51 crc kubenswrapper[4707]: E0129 04:39:51.244757 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:40:04 crc kubenswrapper[4707]: I0129 04:40:04.243363 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:40:04 crc kubenswrapper[4707]: E0129 04:40:04.244135 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" 
podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:40:09 crc kubenswrapper[4707]: I0129 04:40:09.567367 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bzqlt_e8275a2b-4124-46d8-b2f1-4a7e8401e369/control-plane-machine-set-operator/0.log" Jan 29 04:40:09 crc kubenswrapper[4707]: I0129 04:40:09.769981 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mtt54_23df2202-fce8-4515-b147-1256fe6d953b/kube-rbac-proxy/0.log" Jan 29 04:40:09 crc kubenswrapper[4707]: I0129 04:40:09.808953 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mtt54_23df2202-fce8-4515-b147-1256fe6d953b/machine-api-operator/0.log" Jan 29 04:40:15 crc kubenswrapper[4707]: I0129 04:40:15.244350 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:40:15 crc kubenswrapper[4707]: E0129 04:40:15.245052 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:40:22 crc kubenswrapper[4707]: I0129 04:40:22.503196 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-khpr5_2eac8e69-1dad-4d12-bdc5-24cb7659f04d/cert-manager-controller/0.log" Jan 29 04:40:22 crc kubenswrapper[4707]: I0129 04:40:22.687842 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-h296l_ef3e8867-84ba-49f9-878e-482ae14faaa7/cert-manager-cainjector/0.log" Jan 29 04:40:22 
crc kubenswrapper[4707]: I0129 04:40:22.791498 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vbkth_e32edc95-69cc-48f9-8840-a8bba34d4649/cert-manager-webhook/0.log" Jan 29 04:40:26 crc kubenswrapper[4707]: I0129 04:40:26.244228 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:40:26 crc kubenswrapper[4707]: E0129 04:40:26.245000 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:40:36 crc kubenswrapper[4707]: I0129 04:40:36.605481 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-rkzwz_4657ed73-851c-43e4-9f85-e06471c81722/nmstate-console-plugin/0.log" Jan 29 04:40:36 crc kubenswrapper[4707]: I0129 04:40:36.781970 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kdqjs_966ffde7-06ec-4066-b9db-b4b1e750095f/kube-rbac-proxy/0.log" Jan 29 04:40:36 crc kubenswrapper[4707]: I0129 04:40:36.792243 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xvfr7_35c9d599-ff9e-4713-8c0d-6e72c41f6859/nmstate-handler/0.log" Jan 29 04:40:36 crc kubenswrapper[4707]: I0129 04:40:36.848759 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kdqjs_966ffde7-06ec-4066-b9db-b4b1e750095f/nmstate-metrics/0.log" Jan 29 04:40:36 crc kubenswrapper[4707]: I0129 04:40:36.972580 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-6497v_172d1247-a499-49cf-a003-3c70d059385f/nmstate-operator/0.log" Jan 29 04:40:37 crc kubenswrapper[4707]: I0129 04:40:37.030161 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-7rh4l_7eec8389-133d-412b-a2f6-813eaf6e6468/nmstate-webhook/0.log" Jan 29 04:40:39 crc kubenswrapper[4707]: I0129 04:40:39.243452 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:40:39 crc kubenswrapper[4707]: E0129 04:40:39.244149 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:40:52 crc kubenswrapper[4707]: I0129 04:40:52.173221 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-zggfz_aac893f7-df17-486c-895f-b5305b76bc60/prometheus-operator/0.log" Jan 29 04:40:52 crc kubenswrapper[4707]: I0129 04:40:52.388474 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd586f795-dhgdl_23eb4701-1c82-40e4-990b-87c4044f51cc/prometheus-operator-admission-webhook/0.log" Jan 29 04:40:52 crc kubenswrapper[4707]: I0129 04:40:52.472283 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd586f795-qmjjg_fd346f51-b69b-4ac8-b4d3-d24201dd0015/prometheus-operator-admission-webhook/0.log" Jan 29 04:40:52 crc kubenswrapper[4707]: I0129 04:40:52.574844 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-gkwm8_cb2589dc-26af-42a0-8fb9-8f908a0fbac9/operator/0.log" Jan 29 04:40:52 crc kubenswrapper[4707]: I0129 04:40:52.701815 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-b6clh_1ab8ee44-5e15-42f6-861c-071cb82c90d4/perses-operator/0.log" Jan 29 04:40:53 crc kubenswrapper[4707]: I0129 04:40:53.243888 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:40:53 crc kubenswrapper[4707]: E0129 04:40:53.244326 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:41:04 crc kubenswrapper[4707]: I0129 04:41:04.243060 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:41:04 crc kubenswrapper[4707]: E0129 04:41:04.243799 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:41:09 crc kubenswrapper[4707]: I0129 04:41:09.977370 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-r9vjq_063768b8-90c6-4b82-b3d2-13fbdc42bab5/kube-rbac-proxy/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 
04:41:10.063652 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-r9vjq_063768b8-90c6-4b82-b3d2-13fbdc42bab5/controller/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.176301 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-frr-files/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.370825 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-frr-files/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.379038 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-reloader/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.413077 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-reloader/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.460083 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-metrics/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.683501 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-frr-files/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.700222 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-metrics/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.701236 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-reloader/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.712652 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-metrics/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.938870 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-reloader/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.945693 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-frr-files/0.log" Jan 29 04:41:10 crc kubenswrapper[4707]: I0129 04:41:10.955906 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/controller/0.log" Jan 29 04:41:11 crc kubenswrapper[4707]: I0129 04:41:11.001607 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/cp-metrics/0.log" Jan 29 04:41:11 crc kubenswrapper[4707]: I0129 04:41:11.202322 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/kube-rbac-proxy/0.log" Jan 29 04:41:11 crc kubenswrapper[4707]: I0129 04:41:11.216745 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/frr-metrics/0.log" Jan 29 04:41:11 crc kubenswrapper[4707]: I0129 04:41:11.237289 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/kube-rbac-proxy-frr/0.log" Jan 29 04:41:11 crc kubenswrapper[4707]: I0129 04:41:11.503805 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/reloader/0.log" Jan 29 04:41:11 crc kubenswrapper[4707]: I0129 04:41:11.527062 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-4vtxc_a43d3509-ef8d-47ba-b60f-675e3113086d/frr-k8s-webhook-server/0.log" Jan 29 04:41:11 crc kubenswrapper[4707]: I0129 04:41:11.772640 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76498b594b-4z2xj_403318b7-a0b4-4a62-8094-9a2ac1127387/manager/0.log" Jan 29 04:41:12 crc kubenswrapper[4707]: I0129 04:41:12.026041 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2frrd_1d738b61-6875-468d-8fdb-c0d567c8ea88/kube-rbac-proxy/0.log" Jan 29 04:41:12 crc kubenswrapper[4707]: I0129 04:41:12.027868 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-866946c6ff-xnqlj_126a1c1c-acae-407d-854d-fbeb74a88a9c/webhook-server/0.log" Jan 29 04:41:12 crc kubenswrapper[4707]: I0129 04:41:12.803368 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2frrd_1d738b61-6875-468d-8fdb-c0d567c8ea88/speaker/0.log" Jan 29 04:41:12 crc kubenswrapper[4707]: I0129 04:41:12.919594 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b72sz_a50001d5-1baf-4746-aa06-afa2a7853541/frr/0.log" Jan 29 04:41:19 crc kubenswrapper[4707]: I0129 04:41:19.262869 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:41:19 crc kubenswrapper[4707]: E0129 04:41:19.264118 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:41:28 crc kubenswrapper[4707]: I0129 
04:41:28.982469 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/util/0.log" Jan 29 04:41:29 crc kubenswrapper[4707]: I0129 04:41:29.600652 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/util/0.log" Jan 29 04:41:29 crc kubenswrapper[4707]: I0129 04:41:29.619432 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/pull/0.log" Jan 29 04:41:29 crc kubenswrapper[4707]: I0129 04:41:29.623295 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/pull/0.log" Jan 29 04:41:29 crc kubenswrapper[4707]: I0129 04:41:29.787360 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/util/0.log" Jan 29 04:41:29 crc kubenswrapper[4707]: I0129 04:41:29.789493 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/extract/0.log" Jan 29 04:41:29 crc kubenswrapper[4707]: I0129 04:41:29.795388 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchlhrs_814f6017-596c-4a1b-87c6-78a2d013cec2/pull/0.log" Jan 29 04:41:29 crc kubenswrapper[4707]: I0129 04:41:29.961659 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/util/0.log" Jan 29 04:41:30 crc kubenswrapper[4707]: I0129 04:41:30.161576 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/pull/0.log" Jan 29 04:41:30 crc kubenswrapper[4707]: I0129 04:41:30.229400 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/util/0.log" Jan 29 04:41:30 crc kubenswrapper[4707]: I0129 04:41:30.242019 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/pull/0.log" Jan 29 04:41:30 crc kubenswrapper[4707]: I0129 04:41:30.244055 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:41:30 crc kubenswrapper[4707]: E0129 04:41:30.244386 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:41:30 crc kubenswrapper[4707]: I0129 04:41:30.441022 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/pull/0.log" Jan 29 04:41:30 crc kubenswrapper[4707]: I0129 04:41:30.474184 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/extract/0.log" Jan 29 04:41:30 crc kubenswrapper[4707]: I0129 04:41:30.712412 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713lv4kn_326126f2-a0ee-40f0-9bf9-82dc8f430539/util/0.log" Jan 29 04:41:30 crc kubenswrapper[4707]: I0129 04:41:30.866085 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/util/0.log" Jan 29 04:41:31 crc kubenswrapper[4707]: I0129 04:41:31.650168 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/pull/0.log" Jan 29 04:41:31 crc kubenswrapper[4707]: I0129 04:41:31.657495 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/pull/0.log" Jan 29 04:41:31 crc kubenswrapper[4707]: I0129 04:41:31.666814 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/util/0.log" Jan 29 04:41:31 crc kubenswrapper[4707]: I0129 04:41:31.866420 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/util/0.log" Jan 29 04:41:31 crc kubenswrapper[4707]: I0129 04:41:31.886783 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/pull/0.log" Jan 29 
04:41:31 crc kubenswrapper[4707]: I0129 04:41:31.899828 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p59hc_92ea08d2-9b03-4237-8606-3ce08e97a0a3/extract/0.log" Jan 29 04:41:32 crc kubenswrapper[4707]: I0129 04:41:32.056976 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-utilities/0.log" Jan 29 04:41:32 crc kubenswrapper[4707]: I0129 04:41:32.207240 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-utilities/0.log" Jan 29 04:41:32 crc kubenswrapper[4707]: I0129 04:41:32.230791 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-content/0.log" Jan 29 04:41:32 crc kubenswrapper[4707]: I0129 04:41:32.237160 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-content/0.log" Jan 29 04:41:32 crc kubenswrapper[4707]: I0129 04:41:32.466663 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-content/0.log" Jan 29 04:41:32 crc kubenswrapper[4707]: I0129 04:41:32.497270 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/extract-utilities/0.log" Jan 29 04:41:32 crc kubenswrapper[4707]: I0129 04:41:32.747762 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-utilities/0.log" Jan 29 04:41:32 crc kubenswrapper[4707]: I0129 04:41:32.963344 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2wrp2_87614809-814f-41aa-a98f-8d06b5875cd7/registry-server/0.log" Jan 29 04:41:32 crc kubenswrapper[4707]: I0129 04:41:32.968937 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-utilities/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.033126 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-content/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.048225 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-content/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.238075 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-utilities/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.240462 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/extract-content/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.317087 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qwgfr_d3599142-c844-4a86-9bef-e589d69f0ef4/marketplace-operator/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.519878 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-utilities/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.864431 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-utilities/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.888320 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-content/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.888566 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-content/0.log" Jan 29 04:41:33 crc kubenswrapper[4707]: I0129 04:41:33.978059 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ndnr5_a99f8ef9-ec05-437a-aec4-e7d7bb669485/registry-server/0.log" Jan 29 04:41:34 crc kubenswrapper[4707]: I0129 04:41:34.118828 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-content/0.log" Jan 29 04:41:34 crc kubenswrapper[4707]: I0129 04:41:34.151224 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/extract-utilities/0.log" Jan 29 04:41:34 crc kubenswrapper[4707]: I0129 04:41:34.227781 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-utilities/0.log" Jan 29 04:41:34 crc kubenswrapper[4707]: I0129 04:41:34.268397 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fhkcz_9ae25a3b-cb3e-4ea6-9373-31d52ee5dda2/registry-server/0.log" Jan 29 04:41:34 crc kubenswrapper[4707]: I0129 04:41:34.687883 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-content/0.log" Jan 29 04:41:34 crc kubenswrapper[4707]: I0129 04:41:34.714984 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-utilities/0.log" Jan 29 04:41:34 crc kubenswrapper[4707]: I0129 04:41:34.719614 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-content/0.log" Jan 29 04:41:34 crc kubenswrapper[4707]: I0129 04:41:34.896101 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-utilities/0.log" Jan 29 04:41:34 crc kubenswrapper[4707]: I0129 04:41:34.902119 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/extract-content/0.log" Jan 29 04:41:35 crc kubenswrapper[4707]: I0129 04:41:35.465883 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cb77w_0b4a55f4-b7e6-454e-a8f7-066ce8edb801/registry-server/0.log" Jan 29 04:41:45 crc kubenswrapper[4707]: I0129 04:41:45.243813 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:41:45 crc kubenswrapper[4707]: E0129 04:41:45.244625 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:41:51 crc 
kubenswrapper[4707]: I0129 04:41:51.885446 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-zggfz_aac893f7-df17-486c-895f-b5305b76bc60/prometheus-operator/0.log" Jan 29 04:41:51 crc kubenswrapper[4707]: I0129 04:41:51.890771 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd586f795-dhgdl_23eb4701-1c82-40e4-990b-87c4044f51cc/prometheus-operator-admission-webhook/0.log" Jan 29 04:41:51 crc kubenswrapper[4707]: I0129 04:41:51.924384 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd586f795-qmjjg_fd346f51-b69b-4ac8-b4d3-d24201dd0015/prometheus-operator-admission-webhook/0.log" Jan 29 04:41:52 crc kubenswrapper[4707]: I0129 04:41:52.215386 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-gkwm8_cb2589dc-26af-42a0-8fb9-8f908a0fbac9/operator/0.log" Jan 29 04:41:52 crc kubenswrapper[4707]: I0129 04:41:52.376774 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-b6clh_1ab8ee44-5e15-42f6-861c-071cb82c90d4/perses-operator/0.log" Jan 29 04:41:56 crc kubenswrapper[4707]: E0129 04:41:56.780863 4707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.204:34770->38.102.83.204:39253: write tcp 38.102.83.204:34770->38.102.83.204:39253: write: broken pipe Jan 29 04:41:57 crc kubenswrapper[4707]: I0129 04:41:57.254015 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:41:57 crc kubenswrapper[4707]: E0129 04:41:57.254773 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.438399 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hzjxx"] Jan 29 04:42:04 crc kubenswrapper[4707]: E0129 04:42:04.439260 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b820f4-a051-4ee1-9aab-9ed782eedd01" containerName="container-00" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.439274 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b820f4-a051-4ee1-9aab-9ed782eedd01" containerName="container-00" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.439486 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b820f4-a051-4ee1-9aab-9ed782eedd01" containerName="container-00" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.441381 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.459137 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzjxx"] Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.594925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9fz4\" (UniqueName: \"kubernetes.io/projected/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-kube-api-access-h9fz4\") pod \"redhat-marketplace-hzjxx\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.595040 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-catalog-content\") pod \"redhat-marketplace-hzjxx\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.595331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-utilities\") pod \"redhat-marketplace-hzjxx\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.697381 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-utilities\") pod \"redhat-marketplace-hzjxx\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.697834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-utilities\") pod \"redhat-marketplace-hzjxx\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.698037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9fz4\" (UniqueName: \"kubernetes.io/projected/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-kube-api-access-h9fz4\") pod \"redhat-marketplace-hzjxx\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.698208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-catalog-content\") pod \"redhat-marketplace-hzjxx\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.698743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-catalog-content\") pod \"redhat-marketplace-hzjxx\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:04 crc kubenswrapper[4707]: I0129 04:42:04.818300 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9fz4\" (UniqueName: \"kubernetes.io/projected/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-kube-api-access-h9fz4\") pod \"redhat-marketplace-hzjxx\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:05 crc kubenswrapper[4707]: I0129 04:42:05.061586 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:05 crc kubenswrapper[4707]: I0129 04:42:05.547814 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzjxx"] Jan 29 04:42:05 crc kubenswrapper[4707]: I0129 04:42:05.964895 4707 generic.go:334] "Generic (PLEG): container finished" podID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerID="018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff" exitCode=0 Jan 29 04:42:05 crc kubenswrapper[4707]: I0129 04:42:05.965304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzjxx" event={"ID":"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60","Type":"ContainerDied","Data":"018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff"} Jan 29 04:42:05 crc kubenswrapper[4707]: I0129 04:42:05.965371 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzjxx" event={"ID":"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60","Type":"ContainerStarted","Data":"40c083458b7a2c9cfc059575aef6e5f5251b521746d85902ac84acb103a3b732"} Jan 29 04:42:05 crc kubenswrapper[4707]: I0129 04:42:05.968269 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 04:42:06 crc kubenswrapper[4707]: I0129 04:42:06.977577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzjxx" event={"ID":"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60","Type":"ContainerStarted","Data":"3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b"} Jan 29 04:42:07 crc kubenswrapper[4707]: I0129 04:42:07.989531 4707 generic.go:334] "Generic (PLEG): container finished" podID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerID="3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b" exitCode=0 Jan 29 04:42:07 crc kubenswrapper[4707]: I0129 04:42:07.989648 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-hzjxx" event={"ID":"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60","Type":"ContainerDied","Data":"3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b"} Jan 29 04:42:09 crc kubenswrapper[4707]: I0129 04:42:09.005667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzjxx" event={"ID":"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60","Type":"ContainerStarted","Data":"d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04"} Jan 29 04:42:10 crc kubenswrapper[4707]: I0129 04:42:10.244211 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:42:10 crc kubenswrapper[4707]: E0129 04:42:10.245285 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:42:15 crc kubenswrapper[4707]: I0129 04:42:15.062044 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:15 crc kubenswrapper[4707]: I0129 04:42:15.062315 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:15 crc kubenswrapper[4707]: I0129 04:42:15.111437 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:15 crc kubenswrapper[4707]: I0129 04:42:15.134214 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hzjxx" podStartSLOduration=8.697844645 
podStartE2EDuration="11.134194218s" podCreationTimestamp="2026-01-29 04:42:04 +0000 UTC" firstStartedPulling="2026-01-29 04:42:05.967955818 +0000 UTC m=+4479.452184723" lastFinishedPulling="2026-01-29 04:42:08.404305391 +0000 UTC m=+4481.888534296" observedRunningTime="2026-01-29 04:42:09.025963564 +0000 UTC m=+4482.510192469" watchObservedRunningTime="2026-01-29 04:42:15.134194218 +0000 UTC m=+4488.618423143" Jan 29 04:42:16 crc kubenswrapper[4707]: I0129 04:42:16.183561 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:16 crc kubenswrapper[4707]: I0129 04:42:16.266395 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzjxx"] Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.101749 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hzjxx" podUID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerName="registry-server" containerID="cri-o://d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04" gracePeriod=2 Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.623580 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.749712 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-utilities\") pod \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.750291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9fz4\" (UniqueName: \"kubernetes.io/projected/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-kube-api-access-h9fz4\") pod \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.750331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-catalog-content\") pod \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\" (UID: \"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60\") " Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.751222 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-utilities" (OuterVolumeSpecName: "utilities") pod "15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" (UID: "15dbbd97-cbf2-4cb1-901f-01fbc63c6e60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.762295 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-kube-api-access-h9fz4" (OuterVolumeSpecName: "kube-api-access-h9fz4") pod "15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" (UID: "15dbbd97-cbf2-4cb1-901f-01fbc63c6e60"). InnerVolumeSpecName "kube-api-access-h9fz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.773766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" (UID: "15dbbd97-cbf2-4cb1-901f-01fbc63c6e60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.852885 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.852920 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9fz4\" (UniqueName: \"kubernetes.io/projected/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-kube-api-access-h9fz4\") on node \"crc\" DevicePath \"\"" Jan 29 04:42:18 crc kubenswrapper[4707]: I0129 04:42:18.852931 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.114717 4707 generic.go:334] "Generic (PLEG): container finished" podID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerID="d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04" exitCode=0 Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.114775 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzjxx" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.114789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzjxx" event={"ID":"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60","Type":"ContainerDied","Data":"d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04"} Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.114846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzjxx" event={"ID":"15dbbd97-cbf2-4cb1-901f-01fbc63c6e60","Type":"ContainerDied","Data":"40c083458b7a2c9cfc059575aef6e5f5251b521746d85902ac84acb103a3b732"} Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.114867 4707 scope.go:117] "RemoveContainer" containerID="d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.146926 4707 scope.go:117] "RemoveContainer" containerID="3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.173237 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzjxx"] Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.183670 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzjxx"] Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.199370 4707 scope.go:117] "RemoveContainer" containerID="018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.231219 4707 scope.go:117] "RemoveContainer" containerID="d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04" Jan 29 04:42:19 crc kubenswrapper[4707]: E0129 04:42:19.231712 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04\": container with ID starting with d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04 not found: ID does not exist" containerID="d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.231751 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04"} err="failed to get container status \"d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04\": rpc error: code = NotFound desc = could not find container \"d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04\": container with ID starting with d0d02b566e6139e36b16d66aad097044c35664ffd01e71ba1aaaec948dcbcf04 not found: ID does not exist" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.231781 4707 scope.go:117] "RemoveContainer" containerID="3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b" Jan 29 04:42:19 crc kubenswrapper[4707]: E0129 04:42:19.232061 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b\": container with ID starting with 3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b not found: ID does not exist" containerID="3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.232093 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b"} err="failed to get container status \"3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b\": rpc error: code = NotFound desc = could not find container \"3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b\": container with ID 
starting with 3172aeb312568fe5fbf13eda8bf022a9340e3ec3450971627022f238b8b0b88b not found: ID does not exist" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.232112 4707 scope.go:117] "RemoveContainer" containerID="018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff" Jan 29 04:42:19 crc kubenswrapper[4707]: E0129 04:42:19.232346 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff\": container with ID starting with 018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff not found: ID does not exist" containerID="018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.232370 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff"} err="failed to get container status \"018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff\": rpc error: code = NotFound desc = could not find container \"018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff\": container with ID starting with 018cf21864a598e9fc274419dc3b3eb1689f9eb153313ab2dc9cb26b366c9bff not found: ID does not exist" Jan 29 04:42:19 crc kubenswrapper[4707]: I0129 04:42:19.255557 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" path="/var/lib/kubelet/pods/15dbbd97-cbf2-4cb1-901f-01fbc63c6e60/volumes" Jan 29 04:42:25 crc kubenswrapper[4707]: I0129 04:42:25.243733 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:42:25 crc kubenswrapper[4707]: E0129 04:42:25.244629 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:42:37 crc kubenswrapper[4707]: I0129 04:42:37.278092 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:42:37 crc kubenswrapper[4707]: E0129 04:42:37.279489 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:42:49 crc kubenswrapper[4707]: I0129 04:42:49.243969 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:42:49 crc kubenswrapper[4707]: E0129 04:42:49.245305 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:43:01 crc kubenswrapper[4707]: I0129 04:43:01.244298 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:43:01 crc kubenswrapper[4707]: E0129 04:43:01.246830 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:43:12 crc kubenswrapper[4707]: I0129 04:43:12.243856 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:43:12 crc kubenswrapper[4707]: E0129 04:43:12.245095 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:43:27 crc kubenswrapper[4707]: I0129 04:43:27.265045 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:43:27 crc kubenswrapper[4707]: E0129 04:43:27.266471 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:43:31 crc kubenswrapper[4707]: I0129 04:43:31.966870 4707 generic.go:334] "Generic (PLEG): container finished" podID="6bf87d95-c954-47d1-a325-b0ab708edcdb" containerID="cdc0823e923de4735f44842004cebdbce192d0c94b646cf4743249b3dfe06a7e" exitCode=0 Jan 29 04:43:31 crc kubenswrapper[4707]: I0129 04:43:31.966880 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-qnn84/must-gather-8wm7f" event={"ID":"6bf87d95-c954-47d1-a325-b0ab708edcdb","Type":"ContainerDied","Data":"cdc0823e923de4735f44842004cebdbce192d0c94b646cf4743249b3dfe06a7e"} Jan 29 04:43:31 crc kubenswrapper[4707]: I0129 04:43:31.968584 4707 scope.go:117] "RemoveContainer" containerID="cdc0823e923de4735f44842004cebdbce192d0c94b646cf4743249b3dfe06a7e" Jan 29 04:43:32 crc kubenswrapper[4707]: I0129 04:43:32.074342 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qnn84_must-gather-8wm7f_6bf87d95-c954-47d1-a325-b0ab708edcdb/gather/0.log" Jan 29 04:43:42 crc kubenswrapper[4707]: I0129 04:43:42.243639 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:43:42 crc kubenswrapper[4707]: E0129 04:43:42.244474 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.667853 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-86kgq"] Jan 29 04:43:43 crc kubenswrapper[4707]: E0129 04:43:43.669161 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerName="extract-utilities" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.669183 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerName="extract-utilities" Jan 29 04:43:43 crc kubenswrapper[4707]: E0129 04:43:43.669201 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerName="registry-server" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.669208 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerName="registry-server" Jan 29 04:43:43 crc kubenswrapper[4707]: E0129 04:43:43.669232 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerName="extract-content" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.669239 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerName="extract-content" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.669423 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="15dbbd97-cbf2-4cb1-901f-01fbc63c6e60" containerName="registry-server" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.671568 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.679943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86kgq"] Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.797111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvdgs\" (UniqueName: \"kubernetes.io/projected/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-kube-api-access-mvdgs\") pod \"redhat-operators-86kgq\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.797550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-utilities\") pod \"redhat-operators-86kgq\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " 
pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.797579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-catalog-content\") pod \"redhat-operators-86kgq\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.899724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-utilities\") pod \"redhat-operators-86kgq\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.899777 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-catalog-content\") pod \"redhat-operators-86kgq\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.899877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvdgs\" (UniqueName: \"kubernetes.io/projected/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-kube-api-access-mvdgs\") pod \"redhat-operators-86kgq\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.900366 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-utilities\") pod \"redhat-operators-86kgq\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " 
pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.900419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-catalog-content\") pod \"redhat-operators-86kgq\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.931464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvdgs\" (UniqueName: \"kubernetes.io/projected/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-kube-api-access-mvdgs\") pod \"redhat-operators-86kgq\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:43 crc kubenswrapper[4707]: I0129 04:43:43.991507 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:44 crc kubenswrapper[4707]: I0129 04:43:44.429783 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qnn84/must-gather-8wm7f"] Jan 29 04:43:44 crc kubenswrapper[4707]: I0129 04:43:44.430371 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qnn84/must-gather-8wm7f" podUID="6bf87d95-c954-47d1-a325-b0ab708edcdb" containerName="copy" containerID="cri-o://736c2cf5d70a2c41c8e7afb743adbd2a1c0aac1e9d91f51cd8f2ae911856fec0" gracePeriod=2 Jan 29 04:43:44 crc kubenswrapper[4707]: I0129 04:43:44.449287 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qnn84/must-gather-8wm7f"] Jan 29 04:43:44 crc kubenswrapper[4707]: I0129 04:43:44.479341 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86kgq"] Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.115181 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-qnn84_must-gather-8wm7f_6bf87d95-c954-47d1-a325-b0ab708edcdb/copy/0.log" Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.116795 4707 generic.go:334] "Generic (PLEG): container finished" podID="6bf87d95-c954-47d1-a325-b0ab708edcdb" containerID="736c2cf5d70a2c41c8e7afb743adbd2a1c0aac1e9d91f51cd8f2ae911856fec0" exitCode=143 Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.116971 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="872581ded827f822fea70b1a70739a194131f3fa0a817ab29f654f132d87d937" Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.118839 4707 generic.go:334] "Generic (PLEG): container finished" podID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerID="6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58" exitCode=0 Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.119002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kgq" event={"ID":"dcf530c1-fed7-411c-bbbf-ebed9da95ca2","Type":"ContainerDied","Data":"6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58"} Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.119077 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kgq" event={"ID":"dcf530c1-fed7-411c-bbbf-ebed9da95ca2","Type":"ContainerStarted","Data":"9d5a7fb6a9d89ed9d4976efa10c080f3341b1a7e18d5e89e96960a8dedb27aad"} Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.148580 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qnn84_must-gather-8wm7f_6bf87d95-c954-47d1-a325-b0ab708edcdb/copy/0.log" Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.149592 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.242417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf8nq\" (UniqueName: \"kubernetes.io/projected/6bf87d95-c954-47d1-a325-b0ab708edcdb-kube-api-access-qf8nq\") pod \"6bf87d95-c954-47d1-a325-b0ab708edcdb\" (UID: \"6bf87d95-c954-47d1-a325-b0ab708edcdb\") " Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.242675 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bf87d95-c954-47d1-a325-b0ab708edcdb-must-gather-output\") pod \"6bf87d95-c954-47d1-a325-b0ab708edcdb\" (UID: \"6bf87d95-c954-47d1-a325-b0ab708edcdb\") " Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.256857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf87d95-c954-47d1-a325-b0ab708edcdb-kube-api-access-qf8nq" (OuterVolumeSpecName: "kube-api-access-qf8nq") pod "6bf87d95-c954-47d1-a325-b0ab708edcdb" (UID: "6bf87d95-c954-47d1-a325-b0ab708edcdb"). InnerVolumeSpecName "kube-api-access-qf8nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.344498 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf8nq\" (UniqueName: \"kubernetes.io/projected/6bf87d95-c954-47d1-a325-b0ab708edcdb-kube-api-access-qf8nq\") on node \"crc\" DevicePath \"\"" Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.434874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bf87d95-c954-47d1-a325-b0ab708edcdb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6bf87d95-c954-47d1-a325-b0ab708edcdb" (UID: "6bf87d95-c954-47d1-a325-b0ab708edcdb"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:43:45 crc kubenswrapper[4707]: I0129 04:43:45.446000 4707 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6bf87d95-c954-47d1-a325-b0ab708edcdb-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 29 04:43:46 crc kubenswrapper[4707]: I0129 04:43:46.133586 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qnn84/must-gather-8wm7f" Jan 29 04:43:46 crc kubenswrapper[4707]: I0129 04:43:46.133593 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kgq" event={"ID":"dcf530c1-fed7-411c-bbbf-ebed9da95ca2","Type":"ContainerStarted","Data":"8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753"} Jan 29 04:43:47 crc kubenswrapper[4707]: I0129 04:43:47.254845 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf87d95-c954-47d1-a325-b0ab708edcdb" path="/var/lib/kubelet/pods/6bf87d95-c954-47d1-a325-b0ab708edcdb/volumes" Jan 29 04:43:50 crc kubenswrapper[4707]: I0129 04:43:50.184971 4707 generic.go:334] "Generic (PLEG): container finished" podID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerID="8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753" exitCode=0 Jan 29 04:43:50 crc kubenswrapper[4707]: I0129 04:43:50.185067 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kgq" event={"ID":"dcf530c1-fed7-411c-bbbf-ebed9da95ca2","Type":"ContainerDied","Data":"8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753"} Jan 29 04:43:51 crc kubenswrapper[4707]: I0129 04:43:51.201509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kgq" event={"ID":"dcf530c1-fed7-411c-bbbf-ebed9da95ca2","Type":"ContainerStarted","Data":"8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75"} Jan 
29 04:43:51 crc kubenswrapper[4707]: I0129 04:43:51.227673 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-86kgq" podStartSLOduration=2.746784638 podStartE2EDuration="8.227642078s" podCreationTimestamp="2026-01-29 04:43:43 +0000 UTC" firstStartedPulling="2026-01-29 04:43:45.121638647 +0000 UTC m=+4578.605867552" lastFinishedPulling="2026-01-29 04:43:50.602496047 +0000 UTC m=+4584.086724992" observedRunningTime="2026-01-29 04:43:51.226688621 +0000 UTC m=+4584.710917526" watchObservedRunningTime="2026-01-29 04:43:51.227642078 +0000 UTC m=+4584.711870983" Jan 29 04:43:53 crc kubenswrapper[4707]: I0129 04:43:53.992384 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:53 crc kubenswrapper[4707]: I0129 04:43:53.992907 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:43:55 crc kubenswrapper[4707]: I0129 04:43:55.056951 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-86kgq" podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerName="registry-server" probeResult="failure" output=< Jan 29 04:43:55 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Jan 29 04:43:55 crc kubenswrapper[4707]: > Jan 29 04:43:56 crc kubenswrapper[4707]: I0129 04:43:56.244601 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:43:56 crc kubenswrapper[4707]: E0129 04:43:56.245152 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hbz9l_openshift-machine-config-operator(df12d101-b13d-4276-94b7-422c6609d2e8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" Jan 29 04:44:04 crc kubenswrapper[4707]: I0129 04:44:04.038967 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:44:04 crc kubenswrapper[4707]: I0129 04:44:04.090944 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:44:04 crc kubenswrapper[4707]: I0129 04:44:04.296899 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86kgq"] Jan 29 04:44:05 crc kubenswrapper[4707]: I0129 04:44:05.339403 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-86kgq" podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerName="registry-server" containerID="cri-o://8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75" gracePeriod=2 Jan 29 04:44:05 crc kubenswrapper[4707]: I0129 04:44:05.932094 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.057694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvdgs\" (UniqueName: \"kubernetes.io/projected/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-kube-api-access-mvdgs\") pod \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.057820 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-utilities\") pod \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.057914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-catalog-content\") pod \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\" (UID: \"dcf530c1-fed7-411c-bbbf-ebed9da95ca2\") " Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.058879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-utilities" (OuterVolumeSpecName: "utilities") pod "dcf530c1-fed7-411c-bbbf-ebed9da95ca2" (UID: "dcf530c1-fed7-411c-bbbf-ebed9da95ca2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.065069 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-kube-api-access-mvdgs" (OuterVolumeSpecName: "kube-api-access-mvdgs") pod "dcf530c1-fed7-411c-bbbf-ebed9da95ca2" (UID: "dcf530c1-fed7-411c-bbbf-ebed9da95ca2"). InnerVolumeSpecName "kube-api-access-mvdgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.161215 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvdgs\" (UniqueName: \"kubernetes.io/projected/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-kube-api-access-mvdgs\") on node \"crc\" DevicePath \"\"" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.161483 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.189021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcf530c1-fed7-411c-bbbf-ebed9da95ca2" (UID: "dcf530c1-fed7-411c-bbbf-ebed9da95ca2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.263410 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf530c1-fed7-411c-bbbf-ebed9da95ca2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.351675 4707 generic.go:334] "Generic (PLEG): container finished" podID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerID="8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75" exitCode=0 Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.351730 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kgq" event={"ID":"dcf530c1-fed7-411c-bbbf-ebed9da95ca2","Type":"ContainerDied","Data":"8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75"} Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.351743 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86kgq" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.351761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kgq" event={"ID":"dcf530c1-fed7-411c-bbbf-ebed9da95ca2","Type":"ContainerDied","Data":"9d5a7fb6a9d89ed9d4976efa10c080f3341b1a7e18d5e89e96960a8dedb27aad"} Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.351779 4707 scope.go:117] "RemoveContainer" containerID="8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.371858 4707 scope.go:117] "RemoveContainer" containerID="8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.404967 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86kgq"] Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.407784 4707 scope.go:117] "RemoveContainer" containerID="6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.414355 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-86kgq"] Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.443362 4707 scope.go:117] "RemoveContainer" containerID="8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75" Jan 29 04:44:06 crc kubenswrapper[4707]: E0129 04:44:06.443818 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75\": container with ID starting with 8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75 not found: ID does not exist" containerID="8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.443848 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75"} err="failed to get container status \"8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75\": rpc error: code = NotFound desc = could not find container \"8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75\": container with ID starting with 8f0244ae3f4bcd927148aaaad8198754479fdabe97f08c6ef4caf4714d1d9d75 not found: ID does not exist" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.443868 4707 scope.go:117] "RemoveContainer" containerID="8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753" Jan 29 04:44:06 crc kubenswrapper[4707]: E0129 04:44:06.444254 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753\": container with ID starting with 8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753 not found: ID does not exist" containerID="8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.444280 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753"} err="failed to get container status \"8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753\": rpc error: code = NotFound desc = could not find container \"8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753\": container with ID starting with 8c90107777c38b10e936575b2e92005309780b61fca89806d8914459df830753 not found: ID does not exist" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.444295 4707 scope.go:117] "RemoveContainer" containerID="6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58" Jan 29 04:44:06 crc kubenswrapper[4707]: E0129 
04:44:06.444579 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58\": container with ID starting with 6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58 not found: ID does not exist" containerID="6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58" Jan 29 04:44:06 crc kubenswrapper[4707]: I0129 04:44:06.444602 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58"} err="failed to get container status \"6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58\": rpc error: code = NotFound desc = could not find container \"6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58\": container with ID starting with 6d51acb48ef8369943b95b8f3e5e292faf49d8d94c9ff44b48557fd8f58dfd58 not found: ID does not exist" Jan 29 04:44:07 crc kubenswrapper[4707]: I0129 04:44:07.257470 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" path="/var/lib/kubelet/pods/dcf530c1-fed7-411c-bbbf-ebed9da95ca2/volumes" Jan 29 04:44:09 crc kubenswrapper[4707]: I0129 04:44:09.243508 4707 scope.go:117] "RemoveContainer" containerID="4c69684963e05464b1459c3048565a7c1242d5cdf05f3679f94e633ffd8d0ee6" Jan 29 04:44:10 crc kubenswrapper[4707]: I0129 04:44:10.402066 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" event={"ID":"df12d101-b13d-4276-94b7-422c6609d2e8","Type":"ContainerStarted","Data":"a5df08e79b48c9b6c86392ea349adeb5b9db0ef6b8f7fd9bae2f9e3e7a0a85d6"} Jan 29 04:44:42 crc kubenswrapper[4707]: I0129 04:44:42.127009 4707 scope.go:117] "RemoveContainer" containerID="736c2cf5d70a2c41c8e7afb743adbd2a1c0aac1e9d91f51cd8f2ae911856fec0" Jan 29 04:44:42 crc kubenswrapper[4707]: 
I0129 04:44:42.172738 4707 scope.go:117] "RemoveContainer" containerID="cdc0823e923de4735f44842004cebdbce192d0c94b646cf4743249b3dfe06a7e" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.396238 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4rfvr"] Jan 29 04:44:59 crc kubenswrapper[4707]: E0129 04:44:59.398375 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerName="registry-server" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.398473 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerName="registry-server" Jan 29 04:44:59 crc kubenswrapper[4707]: E0129 04:44:59.398574 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf87d95-c954-47d1-a325-b0ab708edcdb" containerName="copy" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.398646 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf87d95-c954-47d1-a325-b0ab708edcdb" containerName="copy" Jan 29 04:44:59 crc kubenswrapper[4707]: E0129 04:44:59.398709 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf87d95-c954-47d1-a325-b0ab708edcdb" containerName="gather" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.398764 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf87d95-c954-47d1-a325-b0ab708edcdb" containerName="gather" Jan 29 04:44:59 crc kubenswrapper[4707]: E0129 04:44:59.398872 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerName="extract-utilities" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.398936 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerName="extract-utilities" Jan 29 04:44:59 crc kubenswrapper[4707]: E0129 04:44:59.399005 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerName="extract-content" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.399060 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerName="extract-content" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.399646 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf87d95-c954-47d1-a325-b0ab708edcdb" containerName="copy" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.399749 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf530c1-fed7-411c-bbbf-ebed9da95ca2" containerName="registry-server" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.399821 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf87d95-c954-47d1-a325-b0ab708edcdb" containerName="gather" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.401435 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.409943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rfvr"] Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.583333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-utilities\") pod \"certified-operators-4rfvr\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.583436 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bs2v\" (UniqueName: \"kubernetes.io/projected/fcd795c6-708f-47ba-8dc7-faf254011adb-kube-api-access-8bs2v\") pod \"certified-operators-4rfvr\" (UID: 
\"fcd795c6-708f-47ba-8dc7-faf254011adb\") " pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.583638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-catalog-content\") pod \"certified-operators-4rfvr\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.686378 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bs2v\" (UniqueName: \"kubernetes.io/projected/fcd795c6-708f-47ba-8dc7-faf254011adb-kube-api-access-8bs2v\") pod \"certified-operators-4rfvr\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.686635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-catalog-content\") pod \"certified-operators-4rfvr\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.686723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-utilities\") pod \"certified-operators-4rfvr\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.687309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-catalog-content\") pod \"certified-operators-4rfvr\" (UID: 
\"fcd795c6-708f-47ba-8dc7-faf254011adb\") " pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.687456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-utilities\") pod \"certified-operators-4rfvr\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.711006 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bs2v\" (UniqueName: \"kubernetes.io/projected/fcd795c6-708f-47ba-8dc7-faf254011adb-kube-api-access-8bs2v\") pod \"certified-operators-4rfvr\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:44:59 crc kubenswrapper[4707]: I0129 04:44:59.739835 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.191138 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr"] Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.193114 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.196284 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.196892 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.217290 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr"] Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.302990 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-secret-volume\") pod \"collect-profiles-29494365-zw2qr\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.303064 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-config-volume\") pod \"collect-profiles-29494365-zw2qr\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.303403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5bzp\" (UniqueName: \"kubernetes.io/projected/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-kube-api-access-v5bzp\") pod \"collect-profiles-29494365-zw2qr\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.326618 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rfvr"] Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.405554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-secret-volume\") pod \"collect-profiles-29494365-zw2qr\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.406049 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-config-volume\") pod \"collect-profiles-29494365-zw2qr\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.406250 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5bzp\" (UniqueName: \"kubernetes.io/projected/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-kube-api-access-v5bzp\") pod \"collect-profiles-29494365-zw2qr\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.408649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-config-volume\") pod \"collect-profiles-29494365-zw2qr\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.415016 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-secret-volume\") pod \"collect-profiles-29494365-zw2qr\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.424259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5bzp\" (UniqueName: \"kubernetes.io/projected/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-kube-api-access-v5bzp\") pod \"collect-profiles-29494365-zw2qr\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.515418 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.849526 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr"] Jan 29 04:45:00 crc kubenswrapper[4707]: W0129 04:45:00.855874 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f3a9887_76e9_4f98_aeb4_3c8e266f4bf4.slice/crio-aa285d2c1cad21cb39650884aac5df4eaa556fa3780a631866b0f9f55264a5e3 WatchSource:0}: Error finding container aa285d2c1cad21cb39650884aac5df4eaa556fa3780a631866b0f9f55264a5e3: Status 404 returned error can't find the container with id aa285d2c1cad21cb39650884aac5df4eaa556fa3780a631866b0f9f55264a5e3 Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.961144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" 
event={"ID":"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4","Type":"ContainerStarted","Data":"aa285d2c1cad21cb39650884aac5df4eaa556fa3780a631866b0f9f55264a5e3"} Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.963817 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcd795c6-708f-47ba-8dc7-faf254011adb" containerID="336b70698438ae176c38d9ff3c1678ecba13c0bb5f7cb13f5949c84e0214f439" exitCode=0 Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.963851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rfvr" event={"ID":"fcd795c6-708f-47ba-8dc7-faf254011adb","Type":"ContainerDied","Data":"336b70698438ae176c38d9ff3c1678ecba13c0bb5f7cb13f5949c84e0214f439"} Jan 29 04:45:00 crc kubenswrapper[4707]: I0129 04:45:00.963875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rfvr" event={"ID":"fcd795c6-708f-47ba-8dc7-faf254011adb","Type":"ContainerStarted","Data":"c0628da78c2af691e2d8e8669d97545ab0ce74b9f9ff6f87e60db97dd27bcca1"} Jan 29 04:45:01 crc kubenswrapper[4707]: I0129 04:45:01.980190 4707 generic.go:334] "Generic (PLEG): container finished" podID="8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4" containerID="73c01a0c65b51b12bbb47f235ff8a82b612608ac33f7553283b84c5faa8f4030" exitCode=0 Jan 29 04:45:01 crc kubenswrapper[4707]: I0129 04:45:01.980271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" event={"ID":"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4","Type":"ContainerDied","Data":"73c01a0c65b51b12bbb47f235ff8a82b612608ac33f7553283b84c5faa8f4030"} Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.002373 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rfvr" event={"ID":"fcd795c6-708f-47ba-8dc7-faf254011adb","Type":"ContainerStarted","Data":"f491a97f3588be239c4c5b220ec2d9ea4a68032e7052d32b99d0579038c4efca"} Jan 29 04:45:03 crc 
kubenswrapper[4707]: I0129 04:45:03.386036 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.503502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5bzp\" (UniqueName: \"kubernetes.io/projected/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-kube-api-access-v5bzp\") pod \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.503715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-config-volume\") pod \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.503911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-secret-volume\") pod \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\" (UID: \"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4\") " Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.504847 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-config-volume" (OuterVolumeSpecName: "config-volume") pod "8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4" (UID: "8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.514363 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4" (UID: "8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.515771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-kube-api-access-v5bzp" (OuterVolumeSpecName: "kube-api-access-v5bzp") pod "8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4" (UID: "8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4"). InnerVolumeSpecName "kube-api-access-v5bzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.607348 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.607383 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5bzp\" (UniqueName: \"kubernetes.io/projected/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-kube-api-access-v5bzp\") on node \"crc\" DevicePath \"\"" Jan 29 04:45:03 crc kubenswrapper[4707]: I0129 04:45:03.607397 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 04:45:04 crc kubenswrapper[4707]: I0129 04:45:04.014704 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcd795c6-708f-47ba-8dc7-faf254011adb" 
containerID="f491a97f3588be239c4c5b220ec2d9ea4a68032e7052d32b99d0579038c4efca" exitCode=0 Jan 29 04:45:04 crc kubenswrapper[4707]: I0129 04:45:04.016823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rfvr" event={"ID":"fcd795c6-708f-47ba-8dc7-faf254011adb","Type":"ContainerDied","Data":"f491a97f3588be239c4c5b220ec2d9ea4a68032e7052d32b99d0579038c4efca"} Jan 29 04:45:04 crc kubenswrapper[4707]: I0129 04:45:04.019367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" event={"ID":"8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4","Type":"ContainerDied","Data":"aa285d2c1cad21cb39650884aac5df4eaa556fa3780a631866b0f9f55264a5e3"} Jan 29 04:45:04 crc kubenswrapper[4707]: I0129 04:45:04.019487 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa285d2c1cad21cb39650884aac5df4eaa556fa3780a631866b0f9f55264a5e3" Jan 29 04:45:04 crc kubenswrapper[4707]: I0129 04:45:04.019437 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494365-zw2qr" Jan 29 04:45:04 crc kubenswrapper[4707]: I0129 04:45:04.481247 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms"] Jan 29 04:45:04 crc kubenswrapper[4707]: I0129 04:45:04.489751 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494320-sktms"] Jan 29 04:45:05 crc kubenswrapper[4707]: I0129 04:45:05.043166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rfvr" event={"ID":"fcd795c6-708f-47ba-8dc7-faf254011adb","Type":"ContainerStarted","Data":"b76c9c1ba86d63eb86d5339a52b912ceba7af4ba22e3a92a413997d881005762"} Jan 29 04:45:05 crc kubenswrapper[4707]: I0129 04:45:05.083832 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4rfvr" podStartSLOduration=2.641748437 podStartE2EDuration="6.083810782s" podCreationTimestamp="2026-01-29 04:44:59 +0000 UTC" firstStartedPulling="2026-01-29 04:45:00.965913194 +0000 UTC m=+4654.450142099" lastFinishedPulling="2026-01-29 04:45:04.407975539 +0000 UTC m=+4657.892204444" observedRunningTime="2026-01-29 04:45:05.077404563 +0000 UTC m=+4658.561633468" watchObservedRunningTime="2026-01-29 04:45:05.083810782 +0000 UTC m=+4658.568039697" Jan 29 04:45:05 crc kubenswrapper[4707]: I0129 04:45:05.259615 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5954036d-8bd6-4b27-9156-75fdc0744f98" path="/var/lib/kubelet/pods/5954036d-8bd6-4b27-9156-75fdc0744f98/volumes" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.612073 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s6p94"] Jan 29 04:45:08 crc kubenswrapper[4707]: E0129 04:45:08.615761 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4" containerName="collect-profiles" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.616241 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4" containerName="collect-profiles" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.616812 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f3a9887-76e9-4f98-aeb4-3c8e266f4bf4" containerName="collect-profiles" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.618808 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.637640 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6p94"] Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.725950 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvz5w\" (UniqueName: \"kubernetes.io/projected/31ee20f0-b4fd-4287-a964-3ae8722595ae-kube-api-access-nvz5w\") pod \"community-operators-s6p94\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") " pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.726047 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-utilities\") pod \"community-operators-s6p94\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") " pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.726185 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-catalog-content\") pod \"community-operators-s6p94\" (UID: 
\"31ee20f0-b4fd-4287-a964-3ae8722595ae\") " pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.828746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvz5w\" (UniqueName: \"kubernetes.io/projected/31ee20f0-b4fd-4287-a964-3ae8722595ae-kube-api-access-nvz5w\") pod \"community-operators-s6p94\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") " pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.828834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-utilities\") pod \"community-operators-s6p94\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") " pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.828911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-catalog-content\") pod \"community-operators-s6p94\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") " pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.829689 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-catalog-content\") pod \"community-operators-s6p94\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") " pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.830000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-utilities\") pod \"community-operators-s6p94\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") 
" pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.856143 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvz5w\" (UniqueName: \"kubernetes.io/projected/31ee20f0-b4fd-4287-a964-3ae8722595ae-kube-api-access-nvz5w\") pod \"community-operators-s6p94\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") " pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:08 crc kubenswrapper[4707]: I0129 04:45:08.959103 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6p94" Jan 29 04:45:09 crc kubenswrapper[4707]: I0129 04:45:09.505822 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6p94"] Jan 29 04:45:09 crc kubenswrapper[4707]: I0129 04:45:09.740470 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:45:09 crc kubenswrapper[4707]: I0129 04:45:09.741130 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:45:09 crc kubenswrapper[4707]: I0129 04:45:09.830151 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:45:10 crc kubenswrapper[4707]: I0129 04:45:10.109147 4707 generic.go:334] "Generic (PLEG): container finished" podID="31ee20f0-b4fd-4287-a964-3ae8722595ae" containerID="ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0" exitCode=0 Jan 29 04:45:10 crc kubenswrapper[4707]: I0129 04:45:10.109286 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6p94" event={"ID":"31ee20f0-b4fd-4287-a964-3ae8722595ae","Type":"ContainerDied","Data":"ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0"} Jan 29 04:45:10 crc 
kubenswrapper[4707]: I0129 04:45:10.109370 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6p94" event={"ID":"31ee20f0-b4fd-4287-a964-3ae8722595ae","Type":"ContainerStarted","Data":"b9238f5b61516b64222d0dbc5622f4e941b919359a6aa9fd28f5145d35f685b4"} Jan 29 04:45:10 crc kubenswrapper[4707]: I0129 04:45:10.194216 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:45:11 crc kubenswrapper[4707]: I0129 04:45:11.121054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6p94" event={"ID":"31ee20f0-b4fd-4287-a964-3ae8722595ae","Type":"ContainerStarted","Data":"5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20"} Jan 29 04:45:12 crc kubenswrapper[4707]: I0129 04:45:12.176828 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rfvr"] Jan 29 04:45:12 crc kubenswrapper[4707]: I0129 04:45:12.177473 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4rfvr" podUID="fcd795c6-708f-47ba-8dc7-faf254011adb" containerName="registry-server" containerID="cri-o://b76c9c1ba86d63eb86d5339a52b912ceba7af4ba22e3a92a413997d881005762" gracePeriod=2 Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.146263 4707 generic.go:334] "Generic (PLEG): container finished" podID="31ee20f0-b4fd-4287-a964-3ae8722595ae" containerID="5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20" exitCode=0 Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.146352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6p94" event={"ID":"31ee20f0-b4fd-4287-a964-3ae8722595ae","Type":"ContainerDied","Data":"5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20"} Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 
04:45:13.153352 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcd795c6-708f-47ba-8dc7-faf254011adb" containerID="b76c9c1ba86d63eb86d5339a52b912ceba7af4ba22e3a92a413997d881005762" exitCode=0 Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.153411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rfvr" event={"ID":"fcd795c6-708f-47ba-8dc7-faf254011adb","Type":"ContainerDied","Data":"b76c9c1ba86d63eb86d5339a52b912ceba7af4ba22e3a92a413997d881005762"} Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.153454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rfvr" event={"ID":"fcd795c6-708f-47ba-8dc7-faf254011adb","Type":"ContainerDied","Data":"c0628da78c2af691e2d8e8669d97545ab0ce74b9f9ff6f87e60db97dd27bcca1"} Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.153479 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0628da78c2af691e2d8e8669d97545ab0ce74b9f9ff6f87e60db97dd27bcca1" Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.182923 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rfvr" Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.265737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-catalog-content\") pod \"fcd795c6-708f-47ba-8dc7-faf254011adb\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.265796 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bs2v\" (UniqueName: \"kubernetes.io/projected/fcd795c6-708f-47ba-8dc7-faf254011adb-kube-api-access-8bs2v\") pod \"fcd795c6-708f-47ba-8dc7-faf254011adb\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.265920 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-utilities\") pod \"fcd795c6-708f-47ba-8dc7-faf254011adb\" (UID: \"fcd795c6-708f-47ba-8dc7-faf254011adb\") " Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.279083 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-utilities" (OuterVolumeSpecName: "utilities") pod "fcd795c6-708f-47ba-8dc7-faf254011adb" (UID: "fcd795c6-708f-47ba-8dc7-faf254011adb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.294516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd795c6-708f-47ba-8dc7-faf254011adb-kube-api-access-8bs2v" (OuterVolumeSpecName: "kube-api-access-8bs2v") pod "fcd795c6-708f-47ba-8dc7-faf254011adb" (UID: "fcd795c6-708f-47ba-8dc7-faf254011adb"). InnerVolumeSpecName "kube-api-access-8bs2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.337317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcd795c6-708f-47ba-8dc7-faf254011adb" (UID: "fcd795c6-708f-47ba-8dc7-faf254011adb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.370654 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.370705 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcd795c6-708f-47ba-8dc7-faf254011adb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 04:45:13 crc kubenswrapper[4707]: I0129 04:45:13.370723 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bs2v\" (UniqueName: \"kubernetes.io/projected/fcd795c6-708f-47ba-8dc7-faf254011adb-kube-api-access-8bs2v\") on node \"crc\" DevicePath \"\"" Jan 29 04:45:14 crc kubenswrapper[4707]: I0129 04:45:14.164019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6p94" event={"ID":"31ee20f0-b4fd-4287-a964-3ae8722595ae","Type":"ContainerStarted","Data":"2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53"} Jan 29 04:45:14 crc kubenswrapper[4707]: I0129 04:45:14.164074 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rfvr"
Jan 29 04:45:14 crc kubenswrapper[4707]: I0129 04:45:14.194305 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s6p94" podStartSLOduration=2.75574558 podStartE2EDuration="6.194279737s" podCreationTimestamp="2026-01-29 04:45:08 +0000 UTC" firstStartedPulling="2026-01-29 04:45:10.112737769 +0000 UTC m=+4663.596966714" lastFinishedPulling="2026-01-29 04:45:13.551271966 +0000 UTC m=+4667.035500871" observedRunningTime="2026-01-29 04:45:14.183868025 +0000 UTC m=+4667.668096930" watchObservedRunningTime="2026-01-29 04:45:14.194279737 +0000 UTC m=+4667.678508642"
Jan 29 04:45:14 crc kubenswrapper[4707]: I0129 04:45:14.208879 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rfvr"]
Jan 29 04:45:14 crc kubenswrapper[4707]: I0129 04:45:14.219916 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4rfvr"]
Jan 29 04:45:15 crc kubenswrapper[4707]: I0129 04:45:15.256703 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd795c6-708f-47ba-8dc7-faf254011adb" path="/var/lib/kubelet/pods/fcd795c6-708f-47ba-8dc7-faf254011adb/volumes"
Jan 29 04:45:18 crc kubenswrapper[4707]: I0129 04:45:18.960123 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s6p94"
Jan 29 04:45:18 crc kubenswrapper[4707]: I0129 04:45:18.960748 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s6p94"
Jan 29 04:45:19 crc kubenswrapper[4707]: I0129 04:45:19.005765 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s6p94"
Jan 29 04:45:19 crc kubenswrapper[4707]: I0129 04:45:19.259784 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s6p94"
Jan 29 04:45:20 crc kubenswrapper[4707]: I0129 04:45:20.176916 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6p94"]
Jan 29 04:45:21 crc kubenswrapper[4707]: I0129 04:45:21.231034 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s6p94" podUID="31ee20f0-b4fd-4287-a964-3ae8722595ae" containerName="registry-server" containerID="cri-o://2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53" gracePeriod=2
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.219818 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6p94"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.245897 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6p94"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.245924 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6p94" event={"ID":"31ee20f0-b4fd-4287-a964-3ae8722595ae","Type":"ContainerDied","Data":"2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53"}
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.245969 4707 scope.go:117] "RemoveContainer" containerID="2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.245893 4707 generic.go:334] "Generic (PLEG): container finished" podID="31ee20f0-b4fd-4287-a964-3ae8722595ae" containerID="2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53" exitCode=0
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.246192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6p94" event={"ID":"31ee20f0-b4fd-4287-a964-3ae8722595ae","Type":"ContainerDied","Data":"b9238f5b61516b64222d0dbc5622f4e941b919359a6aa9fd28f5145d35f685b4"}
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.276842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-utilities\") pod \"31ee20f0-b4fd-4287-a964-3ae8722595ae\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") "
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.277144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-catalog-content\") pod \"31ee20f0-b4fd-4287-a964-3ae8722595ae\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") "
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.277233 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvz5w\" (UniqueName: \"kubernetes.io/projected/31ee20f0-b4fd-4287-a964-3ae8722595ae-kube-api-access-nvz5w\") pod \"31ee20f0-b4fd-4287-a964-3ae8722595ae\" (UID: \"31ee20f0-b4fd-4287-a964-3ae8722595ae\") "
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.281912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-utilities" (OuterVolumeSpecName: "utilities") pod "31ee20f0-b4fd-4287-a964-3ae8722595ae" (UID: "31ee20f0-b4fd-4287-a964-3ae8722595ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.299121 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ee20f0-b4fd-4287-a964-3ae8722595ae-kube-api-access-nvz5w" (OuterVolumeSpecName: "kube-api-access-nvz5w") pod "31ee20f0-b4fd-4287-a964-3ae8722595ae" (UID: "31ee20f0-b4fd-4287-a964-3ae8722595ae"). InnerVolumeSpecName "kube-api-access-nvz5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.307722 4707 scope.go:117] "RemoveContainer" containerID="5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.351835 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31ee20f0-b4fd-4287-a964-3ae8722595ae" (UID: "31ee20f0-b4fd-4287-a964-3ae8722595ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.357469 4707 scope.go:117] "RemoveContainer" containerID="ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.380181 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.380216 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvz5w\" (UniqueName: \"kubernetes.io/projected/31ee20f0-b4fd-4287-a964-3ae8722595ae-kube-api-access-nvz5w\") on node \"crc\" DevicePath \"\""
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.380232 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee20f0-b4fd-4287-a964-3ae8722595ae-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.399173 4707 scope.go:117] "RemoveContainer" containerID="2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53"
Jan 29 04:45:22 crc kubenswrapper[4707]: E0129 04:45:22.399816 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53\": container with ID starting with 2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53 not found: ID does not exist" containerID="2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.399854 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53"} err="failed to get container status \"2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53\": rpc error: code = NotFound desc = could not find container \"2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53\": container with ID starting with 2fd80e8732f1e53e73135d55d94410f2be0c82495c931ce914687903f5f8df53 not found: ID does not exist"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.399878 4707 scope.go:117] "RemoveContainer" containerID="5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20"
Jan 29 04:45:22 crc kubenswrapper[4707]: E0129 04:45:22.400461 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20\": container with ID starting with 5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20 not found: ID does not exist" containerID="5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.400615 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20"} err="failed to get container status \"5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20\": rpc error: code = NotFound desc = could not find container \"5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20\": container with ID starting with 5720ce10ab40b1a476ed616c74819ed679bb2e41db0af0280d84b68ff35fcc20 not found: ID does not exist"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.400703 4707 scope.go:117] "RemoveContainer" containerID="ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0"
Jan 29 04:45:22 crc kubenswrapper[4707]: E0129 04:45:22.401095 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0\": container with ID starting with ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0 not found: ID does not exist" containerID="ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.401118 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0"} err="failed to get container status \"ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0\": rpc error: code = NotFound desc = could not find container \"ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0\": container with ID starting with ee9a1457d8cab201340eec1c6c2151985913e068a038cefcfffa5345fb4c78f0 not found: ID does not exist"
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.589349 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6p94"]
Jan 29 04:45:22 crc kubenswrapper[4707]: I0129 04:45:22.598274 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s6p94"]
Jan 29 04:45:23 crc kubenswrapper[4707]: I0129 04:45:23.260499 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31ee20f0-b4fd-4287-a964-3ae8722595ae" path="/var/lib/kubelet/pods/31ee20f0-b4fd-4287-a964-3ae8722595ae/volumes"
Jan 29 04:45:42 crc kubenswrapper[4707]: I0129 04:45:42.404385 4707 scope.go:117] "RemoveContainer" containerID="68d5a0dd3f54f372e12af3ecd9080a6d2563496a6434ebcef69a8b9a02a5cef6"
Jan 29 04:46:33 crc kubenswrapper[4707]: I0129 04:46:33.462962 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 04:46:33 crc kubenswrapper[4707]: I0129 04:46:33.463524 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 04:47:03 crc kubenswrapper[4707]: I0129 04:47:03.462904 4707 patch_prober.go:28] interesting pod/machine-config-daemon-hbz9l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 04:47:03 crc kubenswrapper[4707]: I0129 04:47:03.463637 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hbz9l" podUID="df12d101-b13d-4276-94b7-422c6609d2e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"